Nov 24 09:03:41 crc systemd[1]: Starting Kubernetes Kubelet...
Nov 24 09:03:42 crc restorecon[4562]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 24 09:03:42 crc restorecon[4562]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc 
restorecon[4562]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 24 09:03:42 crc 
restorecon[4562]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 24 
09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 09:03:42 crc 
restorecon[4562]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 24 09:03:42 crc 
restorecon[4562]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42
crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 
09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 24 09:03:42 crc 
restorecon[4562]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc 
restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc 
restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 
24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 
crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc 
restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc 
restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc 
restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc 
restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 09:03:42 crc 
restorecon[4562]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 09:03:42 crc restorecon[4562]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 24 09:03:42 crc restorecon[4562]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 24 09:03:42 crc restorecon[4562]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Nov 24 09:03:42 crc kubenswrapper[4563]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 24 09:03:42 crc kubenswrapper[4563]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Nov 24 09:03:42 crc kubenswrapper[4563]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 24 09:03:42 crc kubenswrapper[4563]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Nov 24 09:03:42 crc kubenswrapper[4563]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Nov 24 09:03:42 crc kubenswrapper[4563]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.920854 4563 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.924908 4563 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.924926 4563 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.924930 4563 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.924935 4563 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.924942 4563 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.924946 4563 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.924950 4563 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.924955 4563 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.924958 4563 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.924962 4563 feature_gate.go:330] 
unrecognized feature gate: ManagedBootImages Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.924966 4563 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.924971 4563 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.924975 4563 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.924980 4563 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.924984 4563 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.924987 4563 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.924995 4563 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.924998 4563 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925002 4563 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925006 4563 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925011 4563 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925016 4563 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925020 4563 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925024 4563 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925027 4563 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925031 4563 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925034 4563 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925038 4563 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925043 4563 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925047 4563 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925051 4563 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925054 4563 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925058 4563 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925062 4563 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925066 4563 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925069 4563 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925074 4563 
feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925077 4563 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925080 4563 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925084 4563 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925090 4563 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925093 4563 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925096 4563 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925100 4563 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925103 4563 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925107 4563 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925110 4563 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925113 4563 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925116 4563 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925119 4563 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925123 4563 feature_gate.go:330] unrecognized feature gate: 
MachineAPIProviderOpenStack Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925126 4563 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925129 4563 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925135 4563 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925138 4563 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925141 4563 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925145 4563 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925148 4563 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925151 4563 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925154 4563 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925161 4563 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925175 4563 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925179 4563 feature_gate.go:330] unrecognized feature gate: Example Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925183 4563 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925190 4563 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925194 4563 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925199 4563 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925203 4563 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925207 4563 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925212 4563 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.925216 4563 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926135 4563 flags.go:64] FLAG: --address="0.0.0.0" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926335 4563 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926348 4563 flags.go:64] FLAG: --anonymous-auth="true" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926354 4563 flags.go:64] FLAG: --application-metrics-count-limit="100" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926361 4563 flags.go:64] FLAG: --authentication-token-webhook="false" Nov 24 09:03:42 crc kubenswrapper[4563]: 
I1124 09:03:42.926365 4563 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926371 4563 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926376 4563 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926380 4563 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926384 4563 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926388 4563 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926392 4563 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926396 4563 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926400 4563 flags.go:64] FLAG: --cgroup-root="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926403 4563 flags.go:64] FLAG: --cgroups-per-qos="true" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926407 4563 flags.go:64] FLAG: --client-ca-file="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926410 4563 flags.go:64] FLAG: --cloud-config="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926414 4563 flags.go:64] FLAG: --cloud-provider="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926418 4563 flags.go:64] FLAG: --cluster-dns="[]" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926424 4563 flags.go:64] FLAG: --cluster-domain="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926428 4563 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926432 4563 flags.go:64] FLAG: --config-dir="" Nov 24 09:03:42 
crc kubenswrapper[4563]: I1124 09:03:42.926435 4563 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926439 4563 flags.go:64] FLAG: --container-log-max-files="5" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926460 4563 flags.go:64] FLAG: --container-log-max-size="10Mi" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926464 4563 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926467 4563 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926472 4563 flags.go:64] FLAG: --containerd-namespace="k8s.io" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926475 4563 flags.go:64] FLAG: --contention-profiling="false" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926479 4563 flags.go:64] FLAG: --cpu-cfs-quota="true" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926482 4563 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926486 4563 flags.go:64] FLAG: --cpu-manager-policy="none" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926491 4563 flags.go:64] FLAG: --cpu-manager-policy-options="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926497 4563 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926500 4563 flags.go:64] FLAG: --enable-controller-attach-detach="true" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926504 4563 flags.go:64] FLAG: --enable-debugging-handlers="true" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926508 4563 flags.go:64] FLAG: --enable-load-reader="false" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926512 4563 flags.go:64] FLAG: --enable-server="true" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926516 
4563 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926521 4563 flags.go:64] FLAG: --event-burst="100" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926525 4563 flags.go:64] FLAG: --event-qps="50" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926528 4563 flags.go:64] FLAG: --event-storage-age-limit="default=0" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926532 4563 flags.go:64] FLAG: --event-storage-event-limit="default=0" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926536 4563 flags.go:64] FLAG: --eviction-hard="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926546 4563 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926551 4563 flags.go:64] FLAG: --eviction-minimum-reclaim="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926555 4563 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926559 4563 flags.go:64] FLAG: --eviction-soft="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926563 4563 flags.go:64] FLAG: --eviction-soft-grace-period="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926566 4563 flags.go:64] FLAG: --exit-on-lock-contention="false" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926570 4563 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926574 4563 flags.go:64] FLAG: --experimental-mounter-path="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926578 4563 flags.go:64] FLAG: --fail-cgroupv1="false" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926581 4563 flags.go:64] FLAG: --fail-swap-on="true" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926585 4563 flags.go:64] FLAG: --feature-gates="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926590 4563 
flags.go:64] FLAG: --file-check-frequency="20s" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926595 4563 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926599 4563 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926603 4563 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926606 4563 flags.go:64] FLAG: --healthz-port="10248" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926610 4563 flags.go:64] FLAG: --help="false" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926614 4563 flags.go:64] FLAG: --hostname-override="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926617 4563 flags.go:64] FLAG: --housekeeping-interval="10s" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926621 4563 flags.go:64] FLAG: --http-check-frequency="20s" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926624 4563 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926628 4563 flags.go:64] FLAG: --image-credential-provider-config="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926631 4563 flags.go:64] FLAG: --image-gc-high-threshold="85" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926651 4563 flags.go:64] FLAG: --image-gc-low-threshold="80" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926655 4563 flags.go:64] FLAG: --image-service-endpoint="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926659 4563 flags.go:64] FLAG: --kernel-memcg-notification="false" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926662 4563 flags.go:64] FLAG: --kube-api-burst="100" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926666 4563 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Nov 24 09:03:42 crc kubenswrapper[4563]: 
I1124 09:03:42.926670 4563 flags.go:64] FLAG: --kube-api-qps="50" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926674 4563 flags.go:64] FLAG: --kube-reserved="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926678 4563 flags.go:64] FLAG: --kube-reserved-cgroup="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926682 4563 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926685 4563 flags.go:64] FLAG: --kubelet-cgroups="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926689 4563 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926693 4563 flags.go:64] FLAG: --lock-file="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926696 4563 flags.go:64] FLAG: --log-cadvisor-usage="false" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926700 4563 flags.go:64] FLAG: --log-flush-frequency="5s" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926703 4563 flags.go:64] FLAG: --log-json-info-buffer-size="0" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926710 4563 flags.go:64] FLAG: --log-json-split-stream="false" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926714 4563 flags.go:64] FLAG: --log-text-info-buffer-size="0" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926717 4563 flags.go:64] FLAG: --log-text-split-stream="false" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926721 4563 flags.go:64] FLAG: --logging-format="text" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926724 4563 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926728 4563 flags.go:64] FLAG: --make-iptables-util-chains="true" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926732 4563 flags.go:64] FLAG: --manifest-url="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 
09:03:42.926737 4563 flags.go:64] FLAG: --manifest-url-header="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926748 4563 flags.go:64] FLAG: --max-housekeeping-interval="15s" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926752 4563 flags.go:64] FLAG: --max-open-files="1000000" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926757 4563 flags.go:64] FLAG: --max-pods="110" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926761 4563 flags.go:64] FLAG: --maximum-dead-containers="-1" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926765 4563 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926769 4563 flags.go:64] FLAG: --memory-manager-policy="None" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926772 4563 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926776 4563 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926780 4563 flags.go:64] FLAG: --node-ip="192.168.126.11" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926784 4563 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926797 4563 flags.go:64] FLAG: --node-status-max-images="50" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926800 4563 flags.go:64] FLAG: --node-status-update-frequency="10s" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926804 4563 flags.go:64] FLAG: --oom-score-adj="-999" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926807 4563 flags.go:64] FLAG: --pod-cidr="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926811 4563 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926816 4563 flags.go:64] FLAG: --pod-manifest-path="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926820 4563 flags.go:64] FLAG: --pod-max-pids="-1" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926824 4563 flags.go:64] FLAG: --pods-per-core="0" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926829 4563 flags.go:64] FLAG: --port="10250" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926833 4563 flags.go:64] FLAG: --protect-kernel-defaults="false" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926837 4563 flags.go:64] FLAG: --provider-id="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926841 4563 flags.go:64] FLAG: --qos-reserved="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926844 4563 flags.go:64] FLAG: --read-only-port="10255" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926848 4563 flags.go:64] FLAG: --register-node="true" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926853 4563 flags.go:64] FLAG: --register-schedulable="true" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926857 4563 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926864 4563 flags.go:64] FLAG: --registry-burst="10" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926868 4563 flags.go:64] FLAG: --registry-qps="5" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926871 4563 flags.go:64] FLAG: --reserved-cpus="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926876 4563 flags.go:64] FLAG: --reserved-memory="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926881 4563 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 
09:03:42.926885 4563 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926889 4563 flags.go:64] FLAG: --rotate-certificates="false" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926892 4563 flags.go:64] FLAG: --rotate-server-certificates="false" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926896 4563 flags.go:64] FLAG: --runonce="false" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926900 4563 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926903 4563 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926907 4563 flags.go:64] FLAG: --seccomp-default="false" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926911 4563 flags.go:64] FLAG: --serialize-image-pulls="true" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926914 4563 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926918 4563 flags.go:64] FLAG: --storage-driver-db="cadvisor" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926922 4563 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926926 4563 flags.go:64] FLAG: --storage-driver-password="root" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926929 4563 flags.go:64] FLAG: --storage-driver-secure="false" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926933 4563 flags.go:64] FLAG: --storage-driver-table="stats" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926936 4563 flags.go:64] FLAG: --storage-driver-user="root" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926940 4563 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926944 4563 flags.go:64] FLAG: --sync-frequency="1m0s" Nov 24 
09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926947 4563 flags.go:64] FLAG: --system-cgroups="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926951 4563 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926959 4563 flags.go:64] FLAG: --system-reserved-cgroup="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926963 4563 flags.go:64] FLAG: --tls-cert-file="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926966 4563 flags.go:64] FLAG: --tls-cipher-suites="[]" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926972 4563 flags.go:64] FLAG: --tls-min-version="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926975 4563 flags.go:64] FLAG: --tls-private-key-file="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926979 4563 flags.go:64] FLAG: --topology-manager-policy="none" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926983 4563 flags.go:64] FLAG: --topology-manager-policy-options="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926987 4563 flags.go:64] FLAG: --topology-manager-scope="container" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926990 4563 flags.go:64] FLAG: --v="2" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.926996 4563 flags.go:64] FLAG: --version="false" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.927001 4563 flags.go:64] FLAG: --vmodule="" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.927006 4563 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.927009 4563 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927130 4563 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927134 4563 feature_gate.go:330] unrecognized feature gate: 
VolumeGroupSnapshot Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927138 4563 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927141 4563 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927145 4563 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927148 4563 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927152 4563 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927155 4563 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927159 4563 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927162 4563 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927176 4563 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927179 4563 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927182 4563 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927186 4563 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927189 4563 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927192 4563 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 
09:03:42.927195 4563 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927198 4563 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927201 4563 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927206 4563 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927209 4563 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927212 4563 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927215 4563 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927218 4563 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927222 4563 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927225 4563 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927229 4563 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927232 4563 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927235 4563 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927238 4563 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927242 4563 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 
24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927245 4563 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927248 4563 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927251 4563 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927254 4563 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927259 4563 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927263 4563 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927266 4563 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927270 4563 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927273 4563 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927277 4563 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927280 4563 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927283 4563 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927286 4563 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927290 4563 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager 
Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927293 4563 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927296 4563 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927299 4563 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927302 4563 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927307 4563 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927311 4563 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927316 4563 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927319 4563 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927323 4563 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927327 4563 feature_gate.go:330] unrecognized feature gate: Example Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927331 4563 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927334 4563 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927338 4563 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927341 4563 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927344 4563 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927347 4563 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927350 4563 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927353 4563 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927357 4563 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927360 4563 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927364 4563 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927368 4563 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927372 4563 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927375 4563 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927378 4563 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.927381 4563 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.927394 4563 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.933302 4563 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.933335 4563 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933401 4563 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933414 4563 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933418 4563 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933422 4563 
feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933426 4563 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933431 4563 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933435 4563 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933438 4563 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933441 4563 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933445 4563 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933448 4563 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933451 4563 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933455 4563 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933458 4563 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933461 4563 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933466 4563 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933470 4563 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933474 4563 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933477 4563 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933480 4563 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933484 4563 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933509 4563 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933515 4563 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933519 4563 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933523 4563 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933527 4563 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933530 4563 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933534 4563 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933539 4563 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933543 4563 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933547 4563 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933551 4563 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933555 4563 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933558 4563 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933563 4563 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933566 4563 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933569 4563 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933572 4563 
feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933576 4563 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933579 4563 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933582 4563 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933587 4563 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933591 4563 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933595 4563 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933598 4563 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933602 4563 feature_gate.go:330] unrecognized feature gate: Example Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933605 4563 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933609 4563 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933613 4563 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933616 4563 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933620 4563 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933623 4563 feature_gate.go:330] unrecognized feature gate: 
NetworkSegmentation Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933627 4563 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933631 4563 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933648 4563 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933652 4563 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933655 4563 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933660 4563 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933664 4563 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933667 4563 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933670 4563 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933673 4563 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933677 4563 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933680 4563 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933683 4563 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933686 4563 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933689 
4563 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933693 4563 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933697 4563 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933701 4563 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933706 4563 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.933712 4563 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933821 4563 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933828 4563 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933832 4563 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933835 4563 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933839 4563 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933842 4563 feature_gate.go:330] unrecognized feature gate: 
MetricsCollectionProfiles Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933846 4563 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933850 4563 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933853 4563 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933856 4563 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933859 4563 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933863 4563 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933866 4563 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933869 4563 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933872 4563 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933875 4563 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933878 4563 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933882 4563 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933886 4563 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933889 4563 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 
09:03:42.933893 4563 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933898 4563 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933902 4563 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933906 4563 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933910 4563 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933913 4563 feature_gate.go:330] unrecognized feature gate: Example Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933917 4563 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933921 4563 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933925 4563 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933928 4563 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933932 4563 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933935 4563 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933938 4563 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933941 4563 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933945 4563 feature_gate.go:330] unrecognized feature 
gate: NutanixMultiSubnets Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933949 4563 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933953 4563 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933956 4563 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933959 4563 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933962 4563 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933965 4563 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933969 4563 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933972 4563 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933975 4563 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933978 4563 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933981 4563 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933984 4563 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933988 4563 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933991 4563 feature_gate.go:330] unrecognized feature gate: 
VSphereControlPlaneMachineSet Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933994 4563 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.933997 4563 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.934001 4563 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.934006 4563 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.934010 4563 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.934014 4563 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.934017 4563 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.934021 4563 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.934024 4563 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.934028 4563 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.934032 4563 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.934036 4563 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.934039 4563 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.934043 4563 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.934046 4563 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.934050 4563 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.934054 4563 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.934057 4563 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.934060 4563 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.934063 4563 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.934066 4563 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 24 09:03:42 crc kubenswrapper[4563]: W1124 09:03:42.934070 4563 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.934075 4563 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false 
RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.934585 4563 server.go:940] "Client rotation is on, will bootstrap in background" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.937388 4563 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.937458 4563 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.938504 4563 server.go:997] "Starting client certificate rotation" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.938529 4563 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.938667 4563 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-28 07:42:04.050168391 +0000 UTC Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.938716 4563 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 814h38m21.111454171s for next certificate rotation Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.951020 4563 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.953069 4563 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.963976 4563 log.go:25] "Validated CRI v1 runtime API" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 
09:03:42.982016 4563 log.go:25] "Validated CRI v1 image API" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.983425 4563 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.986675 4563 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-11-24-09-00-14-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Nov 24 09:03:42 crc kubenswrapper[4563]: I1124 09:03:42.986697 4563 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:49 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm:{mountpoint:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm major:0 minor:42 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:50 fsType:tmpfs blockSize:0} overlay_0-43:{mountpoint:/var/lib/containers/storage/overlay/94b752e0a51c0134b00ddef6dc7a933a9d7c1d9bdc88a18dae4192a0d557d623/merged major:0 minor:43 fsType:overlay blockSize:0}] Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.000714 4563 manager.go:217] Machine: {Timestamp:2025-11-24 09:03:42.999128118 +0000 UTC m=+0.258105585 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2445406 MemoryCapacity:33654116352 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 
NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:3210c6ca-f708-448c-9ff2-b003edce1c8c BootID:f656a478-9d9a-4ffb-98be-bf6c1dcaa83e Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm DeviceMajor:0 DeviceMinor:42 Capacity:65536000 Type:vfs Inodes:4108168 HasInodes:true} {Device:overlay_0-43 DeviceMajor:0 DeviceMinor:43 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:50 Capacity:1073741824 Type:vfs Inodes:4108168 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827056128 Type:vfs Inodes:4108168 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:49 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:fd:2f:d9 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:enp3s0 MacAddress:fa:16:3e:fd:2f:d9 Speed:-1 Mtu:1500} {Name:enp7s0 MacAddress:fa:16:3e:b1:e4:35 Speed:-1 Mtu:1440} {Name:enp7s0.20 MacAddress:52:54:00:c7:03:68 Speed:-1 Mtu:1436} {Name:enp7s0.21 MacAddress:52:54:00:9b:ba:ba Speed:-1 Mtu:1436} {Name:enp7s0.22 MacAddress:52:54:00:d6:78:c0 Speed:-1 Mtu:1436} {Name:eth10 MacAddress:02:50:2e:ab:d9:41 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ca:7b:9c:52:ea:2c Speed:0 Mtu:1500}] 
Topology:[{Id:0 Memory:33654116352 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:65536 Type:Data Level:1} {Id:0 Size:65536 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:65536 Type:Data Level:1} {Id:1 Size:65536 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:65536 Type:Data Level:1} {Id:10 Size:65536 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:65536 Type:Data Level:1} {Id:11 Size:65536 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:65536 Type:Data Level:1} {Id:2 Size:65536 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:65536 Type:Data Level:1} {Id:3 Size:65536 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:65536 Type:Data Level:1} {Id:4 Size:65536 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:65536 Type:Data Level:1} {Id:5 Size:65536 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 
Threads:[6] Caches:[{Id:6 Size:65536 Type:Data Level:1} {Id:6 Size:65536 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:65536 Type:Data Level:1} {Id:7 Size:65536 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:65536 Type:Data Level:1} {Id:8 Size:65536 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:65536 Type:Data Level:1} {Id:9 Size:65536 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.000914 4563 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.001031 4563 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.001886 4563 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.002038 4563 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.002060 4563 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.002218 4563 topology_manager.go:138] "Creating topology manager with none policy" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.002227 4563 container_manager_linux.go:303] "Creating device plugin manager" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.002531 4563 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.002556 4563 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.003030 4563 state_mem.go:36] "Initialized new in-memory state store" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.003100 4563 server.go:1245] "Using root directory" path="/var/lib/kubelet" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.004562 4563 kubelet.go:418] "Attempting to sync node with API server" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.004602 4563 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.004628 4563 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.004648 4563 kubelet.go:324] "Adding apiserver pod source" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.004657 4563 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 
09:03:43.006418 4563 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.006937 4563 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.008291 4563 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Nov 24 09:03:43 crc kubenswrapper[4563]: W1124 09:03:43.008446 4563 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.26.216:6443: connect: connection refused Nov 24 09:03:43 crc kubenswrapper[4563]: W1124 09:03:43.008441 4563 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.26.216:6443: connect: connection refused Nov 24 09:03:43 crc kubenswrapper[4563]: E1124 09:03:43.008519 4563 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.26.216:6443: connect: connection refused" logger="UnhandledError" Nov 24 09:03:43 crc kubenswrapper[4563]: E1124 09:03:43.008519 4563 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.26.216:6443: connect: connection refused" logger="UnhandledError" Nov 24 
09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.009092 4563 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.009113 4563 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.009121 4563 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.009127 4563 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.009137 4563 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.009143 4563 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.009149 4563 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.009159 4563 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.009175 4563 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.009183 4563 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.009192 4563 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.009199 4563 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.009833 4563 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.010185 4563 server.go:1280] "Started kubelet" Nov 24 09:03:43 crc systemd[1]: Started 
Kubernetes Kubelet. Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.010599 4563 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.010543 4563 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.011684 4563 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.26.216:6443: connect: connection refused Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.011802 4563 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.011895 4563 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.011928 4563 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.012031 4563 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 15:44:14.202161264 +0000 UTC Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.012123 4563 volume_manager.go:287] "The desired_state_of_world populator starts" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.014153 4563 volume_manager.go:289] "Starting Kubelet Volume Manager" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.012114 4563 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 510h40m31.190050651s for next certificate rotation Nov 24 09:03:43 crc kubenswrapper[4563]: E1124 09:03:43.012098 4563 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 24 09:03:43 crc kubenswrapper[4563]: 
I1124 09:03:43.014904 4563 factory.go:55] Registering systemd factory Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.014926 4563 factory.go:221] Registration of the systemd container factory successfully Nov 24 09:03:43 crc kubenswrapper[4563]: W1124 09:03:43.015346 4563 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.26.216:6443: connect: connection refused Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.015426 4563 factory.go:153] Registering CRI-O factory Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.015441 4563 factory.go:221] Registration of the crio container factory successfully Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.015490 4563 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.015508 4563 factory.go:103] Registering Raw factory Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.015530 4563 manager.go:1196] Started watching for new ooms in manager Nov 24 09:03:43 crc kubenswrapper[4563]: E1124 09:03:43.015588 4563 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.26.216:6443: connect: connection refused" logger="UnhandledError" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.015875 4563 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Nov 24 09:03:43 crc kubenswrapper[4563]: E1124 09:03:43.015558 4563 event.go:368] "Unable to write event (may retry after 
sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 192.168.26.216:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187ae5efff2e8bef default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-24 09:03:43.010155503 +0000 UTC m=+0.269132950,LastTimestamp:2025-11-24 09:03:43.010155503 +0000 UTC m=+0.269132950,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.016444 4563 manager.go:319] Starting recovery of all containers Nov 24 09:03:43 crc kubenswrapper[4563]: E1124 09:03:43.014714 4563 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.216:6443: connect: connection refused" interval="200ms" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.018397 4563 server.go:460] "Adding debug handlers to kubelet server" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023467 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023502 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Nov 
24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023512 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023524 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023532 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023540 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023547 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023556 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023566 4563 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023575 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023582 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023590 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023599 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023609 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023616 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023625 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023632 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023677 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023684 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023691 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023699 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023708 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023715 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023725 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023732 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023740 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023750 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" 
seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023759 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023768 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023775 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023785 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023819 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023827 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: 
I1124 09:03:43.023836 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023844 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023853 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023862 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023870 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023879 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023887 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023894 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023903 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023910 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023917 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023925 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023934 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023941 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023950 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023957 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023966 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023974 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023982 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" 
seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.023993 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024002 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024012 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024021 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024029 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024038 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 
09:03:43.024046 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024054 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024063 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024071 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024079 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024088 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024095 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024105 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024113 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024120 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024128 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024135 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024144 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024154 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024163 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024179 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024188 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024195 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024204 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024212 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024221 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024230 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024237 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024244 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024252 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024259 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024267 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024274 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024282 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024291 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024299 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024307 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024315 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024323 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024331 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024338 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024346 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024355 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024362 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024371 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024378 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024386 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024393 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024401 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024408 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024418 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024430 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024438 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024446 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024454 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024463 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024471 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024479 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024489 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024498 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024506 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024515 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024523 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024531 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024538 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024546 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024553 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024562 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024570 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024577 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024585 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024598 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024606 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024614 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024623 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024631 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024653 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024661 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.024669 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025609 4563 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025628 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025653 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025661 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025670 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025679 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025687 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025699 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025707 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025715 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025723 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025731 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025739 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025755 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025763 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025770 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025777 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025785 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025794 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025803 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025809 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025817 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025825 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025833 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025841 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025848 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025855 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025878 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025887 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025894 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025901 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025910 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025918 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025926 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025934 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025941 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025949 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025957 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025964 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025971 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025979 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025986 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.025993 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026002 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026010 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026017 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026024 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026032 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026057 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026064 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026072 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026079 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026086 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026094 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026102 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 
09:03:43.026109 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026120 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026128 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026135 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026144 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026151 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026159 4563 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026176 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026183 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026192 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026199 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026206 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026214 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026222 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026230 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026239 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026246 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026253 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026260 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026268 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026277 4563 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026285 4563 reconstruct.go:97] "Volume reconstruction finished" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.026290 4563 reconciler.go:26] "Reconciler: start to sync state" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.034992 4563 manager.go:324] Recovery completed Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.044705 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.046458 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.046490 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.046500 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.047042 4563 cpu_manager.go:225] "Starting CPU manager" policy="none" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.047061 4563 cpu_manager.go:226] 
"Reconciling" reconcilePeriod="10s" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.047092 4563 state_mem.go:36] "Initialized new in-memory state store" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.051202 4563 policy_none.go:49] "None policy: Start" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.051709 4563 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.052319 4563 memory_manager.go:170] "Starting memorymanager" policy="None" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.052341 4563 state_mem.go:35] "Initializing new in-memory state store" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.053406 4563 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.053451 4563 status_manager.go:217] "Starting to sync pod status with apiserver" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.053470 4563 kubelet.go:2335] "Starting kubelet main sync loop" Nov 24 09:03:43 crc kubenswrapper[4563]: E1124 09:03:43.053505 4563 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Nov 24 09:03:43 crc kubenswrapper[4563]: W1124 09:03:43.054473 4563 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.26.216:6443: connect: connection refused Nov 24 09:03:43 crc kubenswrapper[4563]: E1124 09:03:43.054530 4563 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.26.216:6443: connect: connection 
refused" logger="UnhandledError" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.097542 4563 manager.go:334] "Starting Device Plugin manager" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.097662 4563 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.097732 4563 server.go:79] "Starting device plugin registration server" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.098005 4563 eviction_manager.go:189] "Eviction manager: starting control loop" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.098133 4563 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.098416 4563 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.098526 4563 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.098576 4563 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Nov 24 09:03:43 crc kubenswrapper[4563]: E1124 09:03:43.104567 4563 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.153749 4563 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.153840 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 
09:03:43.154540 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.154572 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.154581 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.154686 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.154897 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.154942 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.155321 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.155360 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.155372 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.155533 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.155620 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.155653 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.155677 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.155655 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.155778 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.156465 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.156476 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.156512 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.156523 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.156494 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.156574 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.156712 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.156824 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.156865 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.157247 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.157274 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.157284 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.157374 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.157488 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.157514 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.157591 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.157621 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.157630 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.158212 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.158239 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.158247 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.158453 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.158477 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.158487 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.158628 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.158673 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.159244 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.159272 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.159281 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.199017 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.199750 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.199776 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.199786 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.199802 4563 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 24 09:03:43 crc kubenswrapper[4563]: E1124 09:03:43.200169 4563 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.216:6443: connect: connection refused" node="crc" Nov 24 09:03:43 crc kubenswrapper[4563]: E1124 09:03:43.217435 4563 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.216:6443: connect: connection refused" interval="400ms" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.228631 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.228675 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.228700 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.228716 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.228730 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.228747 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.228779 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.228802 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.228817 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.228831 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.228858 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.228878 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.228892 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.228917 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.228958 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 
24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.329781 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.329928 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.329945 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.329959 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.330073 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.330087 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.330099 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.330111 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.330159 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.330001 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.330244 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 
09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.330018 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.329879 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.330042 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.330308 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.330122 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.330333 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.330346 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.330390 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.330411 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.330433 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.330468 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.330471 4563 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.330359 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.330554 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.330578 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.330596 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.330657 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.330677 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.330722 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.401083 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.402375 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.402399 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.402409 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.402465 4563 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 24 09:03:43 crc kubenswrapper[4563]: E1124 09:03:43.402819 4563 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.216:6443: connect: 
connection refused" node="crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.484063 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.486980 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.500724 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: W1124 09:03:43.509216 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-f65ab3e33d96ab9f8c8518d86adaea16ad7ea5dff4dcd6a4fc922965ab8ef076 WatchSource:0}: Error finding container f65ab3e33d96ab9f8c8518d86adaea16ad7ea5dff4dcd6a4fc922965ab8ef076: Status 404 returned error can't find the container with id f65ab3e33d96ab9f8c8518d86adaea16ad7ea5dff4dcd6a4fc922965ab8ef076 Nov 24 09:03:43 crc kubenswrapper[4563]: W1124 09:03:43.512401 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-72e39af24880380adaa4e409292be042936c9e71ddded8fd849fdc1b4a52788c WatchSource:0}: Error finding container 72e39af24880380adaa4e409292be042936c9e71ddded8fd849fdc1b4a52788c: Status 404 returned error can't find the container with id 72e39af24880380adaa4e409292be042936c9e71ddded8fd849fdc1b4a52788c Nov 24 09:03:43 crc kubenswrapper[4563]: W1124 09:03:43.518890 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-59b2b03910fa45de9a68317a2f6b86a1fbd0e01d5a478c8783405b7aea18c881 WatchSource:0}: Error finding 
container 59b2b03910fa45de9a68317a2f6b86a1fbd0e01d5a478c8783405b7aea18c881: Status 404 returned error can't find the container with id 59b2b03910fa45de9a68317a2f6b86a1fbd0e01d5a478c8783405b7aea18c881 Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.523389 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.527175 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 24 09:03:43 crc kubenswrapper[4563]: W1124 09:03:43.534591 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-82339cac35f125be6bc94274e4b7255194ad0a7586fa010595cae9783be52a19 WatchSource:0}: Error finding container 82339cac35f125be6bc94274e4b7255194ad0a7586fa010595cae9783be52a19: Status 404 returned error can't find the container with id 82339cac35f125be6bc94274e4b7255194ad0a7586fa010595cae9783be52a19 Nov 24 09:03:43 crc kubenswrapper[4563]: W1124 09:03:43.537631 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-663f966774a8f4743500a052b63daec0785d2a3df45c4a1d563519c36f4ee35b WatchSource:0}: Error finding container 663f966774a8f4743500a052b63daec0785d2a3df45c4a1d563519c36f4ee35b: Status 404 returned error can't find the container with id 663f966774a8f4743500a052b63daec0785d2a3df45c4a1d563519c36f4ee35b Nov 24 09:03:43 crc kubenswrapper[4563]: E1124 09:03:43.619146 4563 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.216:6443: connect: connection refused" 
interval="800ms" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.803566 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.804360 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.804390 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.804399 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:43 crc kubenswrapper[4563]: I1124 09:03:43.804418 4563 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 24 09:03:43 crc kubenswrapper[4563]: E1124 09:03:43.804724 4563 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.216:6443: connect: connection refused" node="crc" Nov 24 09:03:43 crc kubenswrapper[4563]: W1124 09:03:43.818140 4563 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.26.216:6443: connect: connection refused Nov 24 09:03:43 crc kubenswrapper[4563]: E1124 09:03:43.818191 4563 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.26.216:6443: connect: connection refused" logger="UnhandledError" Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.012848 4563 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.26.216:6443: connect: connection refused Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.062292 4563 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59" exitCode=0 Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.062353 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59"} Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.062480 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f65ab3e33d96ab9f8c8518d86adaea16ad7ea5dff4dcd6a4fc922965ab8ef076"} Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.062612 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.063660 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.063684 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.063692 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.063688 4563 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="c71f1a45af95e970bf803b18f56400db645e493ba0aca4162bf68c018731155d" exitCode=0 Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.063752 4563 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"c71f1a45af95e970bf803b18f56400db645e493ba0aca4162bf68c018731155d"} Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.063770 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"72e39af24880380adaa4e409292be042936c9e71ddded8fd849fdc1b4a52788c"} Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.063833 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.064659 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.064682 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.064690 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.064984 4563 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="78c2f690f06390616da3ae5e4c6670793559f05b6bbb4a8441cd23db81b863ab" exitCode=0 Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.065028 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"78c2f690f06390616da3ae5e4c6670793559f05b6bbb4a8441cd23db81b863ab"} Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.065065 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"663f966774a8f4743500a052b63daec0785d2a3df45c4a1d563519c36f4ee35b"} Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.065117 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.065784 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.065807 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.065815 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.066346 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9"} Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.066366 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"82339cac35f125be6bc94274e4b7255194ad0a7586fa010595cae9783be52a19"} Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.068233 4563 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6" exitCode=0 Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.068257 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6"} Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.068274 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"59b2b03910fa45de9a68317a2f6b86a1fbd0e01d5a478c8783405b7aea18c881"} Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.068337 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.068889 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.068913 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.068923 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.069958 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.070425 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.070450 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.070459 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:44 crc kubenswrapper[4563]: W1124 09:03:44.276701 4563 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.26.216:6443: connect: connection refused Nov 24 09:03:44 crc kubenswrapper[4563]: E1124 09:03:44.276778 4563 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.26.216:6443: connect: connection refused" logger="UnhandledError" Nov 24 09:03:44 crc kubenswrapper[4563]: W1124 09:03:44.276570 4563 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.26.216:6443: connect: connection refused Nov 24 09:03:44 crc kubenswrapper[4563]: E1124 09:03:44.276897 4563 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.26.216:6443: connect: connection refused" logger="UnhandledError" Nov 24 09:03:44 crc kubenswrapper[4563]: E1124 09:03:44.419962 4563 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.216:6443: connect: connection refused" interval="1.6s" Nov 24 09:03:44 crc kubenswrapper[4563]: W1124 09:03:44.469899 4563 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.26.216:6443: connect: connection refused Nov 24 09:03:44 crc 
kubenswrapper[4563]: E1124 09:03:44.469980 4563 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.26.216:6443: connect: connection refused" logger="UnhandledError" Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.605496 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.606347 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.606382 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.606393 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:44 crc kubenswrapper[4563]: I1124 09:03:44.606424 4563 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 24 09:03:44 crc kubenswrapper[4563]: E1124 09:03:44.606795 4563 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.216:6443: connect: connection refused" node="crc" Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.071026 4563 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a" exitCode=0 Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.071082 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a"} Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.071169 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.071945 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.071981 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.071990 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.073999 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"de9057e94769992c75e6445a5c816164c64ed64f9dd2b07e8b317a7e17654f77"} Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.074051 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.074693 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.074729 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.074738 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.076714 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"265b586b95a5c747bfe4c1a82a536da07d8fded3433c572e7a73ec381565fb6f"} Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.076745 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"13d718684a773c9c1c92ab5722ae89afe44cdd4485b7df17e2d2795ae64645df"} Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.076757 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"76d4c2140ed99b80bf42fb32fc17368efd71c90e29906dd340bcca1a5bb5fabc"} Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.076816 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.077433 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.077462 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.077471 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.079216 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22"} Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.079235 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e"} Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.079246 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9"} Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.079262 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.079972 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.079999 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.080008 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.081910 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed"} Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.081943 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893"} Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.081953 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354"} Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.081963 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57"} Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.081971 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b"} Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.082028 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.082558 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.082584 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:45 crc kubenswrapper[4563]: I1124 09:03:45.082592 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:46 crc kubenswrapper[4563]: I1124 09:03:46.087602 4563 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652" exitCode=0 Nov 24 09:03:46 crc kubenswrapper[4563]: I1124 09:03:46.087719 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:46 crc kubenswrapper[4563]: I1124 09:03:46.088074 4563 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652"} Nov 24 09:03:46 crc kubenswrapper[4563]: I1124 09:03:46.088139 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:46 crc kubenswrapper[4563]: I1124 09:03:46.088506 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:46 crc kubenswrapper[4563]: I1124 09:03:46.088534 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:46 crc kubenswrapper[4563]: I1124 09:03:46.088544 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:46 crc kubenswrapper[4563]: I1124 09:03:46.088659 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:46 crc kubenswrapper[4563]: I1124 09:03:46.088678 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:46 crc kubenswrapper[4563]: I1124 09:03:46.088686 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:46 crc kubenswrapper[4563]: I1124 09:03:46.207575 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:46 crc kubenswrapper[4563]: I1124 09:03:46.208344 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:46 crc kubenswrapper[4563]: I1124 09:03:46.208376 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:46 crc kubenswrapper[4563]: I1124 09:03:46.208386 
4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:46 crc kubenswrapper[4563]: I1124 09:03:46.208405 4563 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 24 09:03:46 crc kubenswrapper[4563]: I1124 09:03:46.317579 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 09:03:47 crc kubenswrapper[4563]: I1124 09:03:47.093094 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6"} Nov 24 09:03:47 crc kubenswrapper[4563]: I1124 09:03:47.093143 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1"} Nov 24 09:03:47 crc kubenswrapper[4563]: I1124 09:03:47.093153 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169"} Nov 24 09:03:47 crc kubenswrapper[4563]: I1124 09:03:47.093138 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:47 crc kubenswrapper[4563]: I1124 09:03:47.093214 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:47 crc kubenswrapper[4563]: I1124 09:03:47.093304 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941"} Nov 24 09:03:47 crc kubenswrapper[4563]: 
I1124 09:03:47.093333 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e"} Nov 24 09:03:47 crc kubenswrapper[4563]: I1124 09:03:47.093884 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:47 crc kubenswrapper[4563]: I1124 09:03:47.093902 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:47 crc kubenswrapper[4563]: I1124 09:03:47.093910 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:47 crc kubenswrapper[4563]: I1124 09:03:47.093964 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:47 crc kubenswrapper[4563]: I1124 09:03:47.093982 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:47 crc kubenswrapper[4563]: I1124 09:03:47.093990 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:47 crc kubenswrapper[4563]: I1124 09:03:47.557711 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Nov 24 09:03:48 crc kubenswrapper[4563]: I1124 09:03:48.094595 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:48 crc kubenswrapper[4563]: I1124 09:03:48.095193 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:48 crc kubenswrapper[4563]: I1124 09:03:48.095218 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:48 crc 
kubenswrapper[4563]: I1124 09:03:48.095225 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:49 crc kubenswrapper[4563]: I1124 09:03:49.096479 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:49 crc kubenswrapper[4563]: I1124 09:03:49.097177 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:49 crc kubenswrapper[4563]: I1124 09:03:49.097217 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:49 crc kubenswrapper[4563]: I1124 09:03:49.097227 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:49 crc kubenswrapper[4563]: I1124 09:03:49.179247 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 09:03:49 crc kubenswrapper[4563]: I1124 09:03:49.179400 4563 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 24 09:03:49 crc kubenswrapper[4563]: I1124 09:03:49.179434 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:49 crc kubenswrapper[4563]: I1124 09:03:49.180568 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:49 crc kubenswrapper[4563]: I1124 09:03:49.180596 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:49 crc kubenswrapper[4563]: I1124 09:03:49.180605 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:49 crc kubenswrapper[4563]: I1124 09:03:49.315499 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 09:03:49 crc kubenswrapper[4563]: I1124 09:03:49.538436 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 09:03:50 crc kubenswrapper[4563]: I1124 09:03:50.098937 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:50 crc kubenswrapper[4563]: I1124 09:03:50.099720 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:50 crc kubenswrapper[4563]: I1124 09:03:50.099816 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:50 crc kubenswrapper[4563]: I1124 09:03:50.099895 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:50 crc kubenswrapper[4563]: I1124 09:03:50.779092 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 24 09:03:50 crc kubenswrapper[4563]: I1124 09:03:50.779263 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:50 crc kubenswrapper[4563]: I1124 09:03:50.780240 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:50 crc kubenswrapper[4563]: I1124 09:03:50.780271 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:50 crc kubenswrapper[4563]: I1124 09:03:50.780282 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:51 crc kubenswrapper[4563]: I1124 09:03:51.100419 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:51 crc kubenswrapper[4563]: I1124 
09:03:51.101275 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:51 crc kubenswrapper[4563]: I1124 09:03:51.101311 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:51 crc kubenswrapper[4563]: I1124 09:03:51.101320 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:51 crc kubenswrapper[4563]: I1124 09:03:51.217586 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Nov 24 09:03:51 crc kubenswrapper[4563]: I1124 09:03:51.217711 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:51 crc kubenswrapper[4563]: I1124 09:03:51.218485 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:51 crc kubenswrapper[4563]: I1124 09:03:51.218522 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:51 crc kubenswrapper[4563]: I1124 09:03:51.218531 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:51 crc kubenswrapper[4563]: I1124 09:03:51.648001 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 09:03:51 crc kubenswrapper[4563]: I1124 09:03:51.648163 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:51 crc kubenswrapper[4563]: I1124 09:03:51.649051 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:51 crc kubenswrapper[4563]: I1124 09:03:51.649082 4563 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:51 crc kubenswrapper[4563]: I1124 09:03:51.649091 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:52 crc kubenswrapper[4563]: I1124 09:03:52.712422 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 09:03:52 crc kubenswrapper[4563]: I1124 09:03:52.712573 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:52 crc kubenswrapper[4563]: I1124 09:03:52.713692 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:52 crc kubenswrapper[4563]: I1124 09:03:52.713742 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:52 crc kubenswrapper[4563]: I1124 09:03:52.713751 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:53 crc kubenswrapper[4563]: E1124 09:03:53.104673 4563 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 24 09:03:54 crc kubenswrapper[4563]: I1124 09:03:54.125726 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 09:03:54 crc kubenswrapper[4563]: I1124 09:03:54.125853 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:54 crc kubenswrapper[4563]: I1124 09:03:54.126587 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:54 crc kubenswrapper[4563]: I1124 09:03:54.126613 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 24 09:03:54 crc kubenswrapper[4563]: I1124 09:03:54.126623 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:54 crc kubenswrapper[4563]: I1124 09:03:54.130128 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 09:03:54 crc kubenswrapper[4563]: I1124 09:03:54.645391 4563 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 24 09:03:54 crc kubenswrapper[4563]: I1124 09:03:54.645442 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 24 09:03:54 crc kubenswrapper[4563]: I1124 09:03:54.648108 4563 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 24 09:03:54 crc kubenswrapper[4563]: I1124 09:03:54.648175 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 24 
09:03:54 crc kubenswrapper[4563]: I1124 09:03:54.648960 4563 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Nov 24 09:03:54 crc kubenswrapper[4563]: I1124 09:03:54.648991 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 24 09:03:55 crc kubenswrapper[4563]: I1124 09:03:55.108327 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:55 crc kubenswrapper[4563]: I1124 09:03:55.109178 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:55 crc kubenswrapper[4563]: I1124 09:03:55.109213 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:55 crc kubenswrapper[4563]: I1124 09:03:55.109221 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:55 crc kubenswrapper[4563]: I1124 09:03:55.111418 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 09:03:55 crc kubenswrapper[4563]: I1124 09:03:55.214218 4563 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints 
namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Nov 24 09:03:55 crc kubenswrapper[4563]: I1124 09:03:55.214261 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Nov 24 09:03:56 crc kubenswrapper[4563]: I1124 09:03:56.110285 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:56 crc kubenswrapper[4563]: I1124 09:03:56.110969 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:56 crc kubenswrapper[4563]: I1124 09:03:56.111031 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:56 crc kubenswrapper[4563]: I1124 09:03:56.111041 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:59 crc kubenswrapper[4563]: I1124 09:03:59.182835 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 09:03:59 crc kubenswrapper[4563]: I1124 09:03:59.182955 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:03:59 crc kubenswrapper[4563]: I1124 09:03:59.183371 4563 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Nov 24 09:03:59 
crc kubenswrapper[4563]: I1124 09:03:59.183496 4563 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Nov 24 09:03:59 crc kubenswrapper[4563]: I1124 09:03:59.183671 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:03:59 crc kubenswrapper[4563]: I1124 09:03:59.183726 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:03:59 crc kubenswrapper[4563]: I1124 09:03:59.183752 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:03:59 crc kubenswrapper[4563]: I1124 09:03:59.185986 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 09:03:59 crc kubenswrapper[4563]: I1124 09:03:59.539481 4563 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Nov 24 09:03:59 crc kubenswrapper[4563]: I1124 09:03:59.539533 4563 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Nov 24 09:03:59 crc kubenswrapper[4563]: E1124 09:03:59.642484 4563 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Nov 24 09:03:59 crc kubenswrapper[4563]: I1124 09:03:59.644038 4563 trace.go:236] Trace[1182168933]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (24-Nov-2025 09:03:46.846) (total time: 12797ms): Nov 24 09:03:59 crc kubenswrapper[4563]: Trace[1182168933]: ---"Objects listed" error: 12797ms (09:03:59.643) Nov 24 09:03:59 crc kubenswrapper[4563]: Trace[1182168933]: [12.797692721s] [12.797692721s] END Nov 24 09:03:59 crc kubenswrapper[4563]: I1124 09:03:59.644067 4563 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 24 09:03:59 crc kubenswrapper[4563]: I1124 09:03:59.644887 4563 trace.go:236] Trace[720447834]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (24-Nov-2025 09:03:45.492) (total time: 14151ms): Nov 24 09:03:59 crc kubenswrapper[4563]: Trace[720447834]: ---"Objects listed" error: 14151ms (09:03:59.644) Nov 24 09:03:59 crc kubenswrapper[4563]: Trace[720447834]: [14.151929665s] [14.151929665s] END Nov 24 09:03:59 crc kubenswrapper[4563]: I1124 09:03:59.644912 4563 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 24 09:03:59 crc kubenswrapper[4563]: I1124 09:03:59.644976 4563 trace.go:236] Trace[1147920013]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (24-Nov-2025 09:03:46.951) (total time: 12693ms): Nov 24 09:03:59 crc kubenswrapper[4563]: Trace[1147920013]: ---"Objects listed" error: 12693ms (09:03:59.644) Nov 24 09:03:59 crc kubenswrapper[4563]: Trace[1147920013]: [12.693591598s] [12.693591598s] END Nov 24 09:03:59 crc kubenswrapper[4563]: I1124 09:03:59.644987 4563 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 24 09:03:59 crc kubenswrapper[4563]: E1124 
09:03:59.645724 4563 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Nov 24 09:03:59 crc kubenswrapper[4563]: I1124 09:03:59.646075 4563 trace.go:236] Trace[1929033789]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (24-Nov-2025 09:03:47.325) (total time: 12320ms): Nov 24 09:03:59 crc kubenswrapper[4563]: Trace[1929033789]: ---"Objects listed" error: 12320ms (09:03:59.645) Nov 24 09:03:59 crc kubenswrapper[4563]: Trace[1929033789]: [12.32060231s] [12.32060231s] END Nov 24 09:03:59 crc kubenswrapper[4563]: I1124 09:03:59.646091 4563 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 24 09:03:59 crc kubenswrapper[4563]: I1124 09:03:59.646569 4563 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.012827 4563 apiserver.go:52] "Watching apiserver" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.015070 4563 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.015340 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.015710 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.015766 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.015787 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.015855 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:00 crc kubenswrapper[4563]: E1124 09:04:00.015873 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:04:00 crc kubenswrapper[4563]: E1124 09:04:00.015894 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.015910 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.015727 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:00 crc kubenswrapper[4563]: E1124 09:04:00.016019 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.017190 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.017398 4563 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.017433 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.017883 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.018093 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.017936 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.018010 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.018274 4563 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"env-overrides" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.018339 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.018295 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049299 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049334 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049353 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049367 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049381 4563 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049397 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049413 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049427 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049442 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049458 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049473 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049490 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049503 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049515 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049528 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049541 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049556 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049574 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049603 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049617 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049631 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049659 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049672 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049686 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049693 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049700 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049740 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049751 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049759 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049788 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049815 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049837 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049855 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049856 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049872 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049888 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049901 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049914 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049929 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049942 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049956 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049970 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.049984 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050000 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050000 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). 
InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050015 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050083 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050109 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050131 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050148 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050169 4563 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050186 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050204 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050220 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050242 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050259 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: 
\"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050276 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050292 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050308 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050325 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050343 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 24 09:04:00 crc 
kubenswrapper[4563]: I1124 09:04:00.050359 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050376 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050394 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050411 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050428 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050446 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050462 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050480 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050499 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050519 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050536 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 24 
09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050554 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050573 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050605 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050623 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050658 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050678 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: 
\"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050699 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050716 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050733 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050750 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050768 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 24 09:04:00 crc 
kubenswrapper[4563]: I1124 09:04:00.050787 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050805 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050824 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050842 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050861 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050878 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050891 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050905 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050919 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050957 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050972 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050986 4563 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051001 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051015 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051031 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051044 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051058 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051076 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051091 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051106 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051120 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051136 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051151 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051165 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051181 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051204 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051219 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051233 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051247 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051262 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051277 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051291 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051305 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051319 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051333 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051346 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051361 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051374 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051411 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051425 
4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051438 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051452 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051466 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051480 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051495 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod 
\"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051510 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051524 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051539 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051554 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051568 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051595 4563 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051612 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051631 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051658 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051673 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051687 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod 
\"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051702 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051719 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051734 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051748 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051762 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051776 4563 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051790 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051809 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051824 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051838 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051853 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051867 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051881 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051898 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051913 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051928 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 
09:04:00.051941 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051955 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051969 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051986 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052001 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052015 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052031 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052045 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052060 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052136 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052153 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052168 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052185 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052200 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052215 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052249 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052266 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " 
Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052282 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052297 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052317 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052333 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052348 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052362 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052376 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052392 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052407 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052423 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052437 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052453 
4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052469 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052484 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052499 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052513 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052528 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052543 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052558 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052574 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052602 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052616 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " 
Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052646 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052663 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050185 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052680 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050331 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052696 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052718 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052733 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052768 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052788 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052808 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052823 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052842 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052858 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052875 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052892 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052910 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052926 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052945 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052961 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: 
\"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052978 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052995 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.053042 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.053054 4563 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.053066 4563 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc 
kubenswrapper[4563]: I1124 09:04:00.053076 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.053085 4563 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.053094 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.053104 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.053525 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.058559 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.059395 4563 swap_util.go:74] "error 
creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.060815 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050376 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050426 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050409 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050467 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050483 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050647 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050691 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050760 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050751 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050803 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050819 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050857 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050911 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050928 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050945 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.050984 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051001 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051130 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051154 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051173 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051214 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051224 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.051879 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.052931 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.053019 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: E1124 09:04:00.053169 4563 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.053165 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.053341 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). 
InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.053521 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.053534 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.053754 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.053840 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.053852 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.053856 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.054040 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.054090 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.054119 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.054168 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.054479 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.054538 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.054725 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.054832 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.054840 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.054978 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.054988 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.055014 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.055055 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.055115 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.055161 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.055236 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.055239 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.055374 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.055383 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.055397 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.055411 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.055704 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.055776 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.055270 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.055897 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.055956 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.056030 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.056112 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.063018 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.056129 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.056214 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.056490 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.056607 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.064068 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.056656 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: E1124 09:04:00.056721 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:04:00.556671775 +0000 UTC m=+17.815649222 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.064090 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.057052 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.057226 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.057274 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.057309 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.057340 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: E1124 09:04:00.057362 4563 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 24 09:04:00 crc kubenswrapper[4563]: E1124 09:04:00.064189 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-11-24 09:04:00.564152697 +0000 UTC m=+17.823130144 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.057381 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.057714 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.064239 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.057802 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.058081 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.058089 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.058098 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.058144 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.058203 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.058215 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.058156 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.058334 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.058474 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.058513 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.058572 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.058618 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.058657 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.058354 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.058843 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.058893 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.059112 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.059408 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.059442 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.059559 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.059574 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.059676 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.059747 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.059762 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.059894 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.059958 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.060192 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.060207 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.060339 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.060398 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.060475 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.060507 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.060695 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.060768 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.060776 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.060787 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.060791 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.060894 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.060896 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.060973 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.060987 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.060998 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.061029 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.061087 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.061221 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.061246 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.061304 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.061310 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.061312 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.061166 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.061562 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.061740 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.061754 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.061790 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.061905 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.061920 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.061950 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.062014 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.062073 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.062171 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.062443 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.062547 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.062625 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.062675 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.062723 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.063106 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.063656 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.063731 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.063750 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.063864 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: E1124 09:04:00.064352 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 09:04:00.564344007 +0000 UTC m=+17.823321453 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.064703 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.064726 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.064645 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.064855 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.064901 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.064986 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.065717 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.065955 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.066022 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.066338 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.066535 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.066711 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.066719 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.072036 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.072349 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: E1124 09:04:00.072366 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 09:04:00 crc kubenswrapper[4563]: E1124 09:04:00.072384 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 09:04:00 crc kubenswrapper[4563]: E1124 09:04:00.072395 4563 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:04:00 crc kubenswrapper[4563]: E1124 09:04:00.072505 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-24 09:04:00.572492315 +0000 UTC m=+17.831469762 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.072607 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.073105 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.073133 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.073147 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.073262 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.073340 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.073387 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.073403 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.073533 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.073554 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.073552 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.073573 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: E1124 09:04:00.073778 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 09:04:00 crc kubenswrapper[4563]: E1124 09:04:00.073792 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 09:04:00 crc kubenswrapper[4563]: E1124 09:04:00.073803 4563 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:04:00 crc kubenswrapper[4563]: E1124 09:04:00.073836 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-24 09:04:00.573826698 +0000 UTC m=+17.832804145 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.075439 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.077790 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.077828 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.077868 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.078022 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.078271 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.079075 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.079303 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.079751 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.080194 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.080253 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.080312 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.080382 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.080593 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.080622 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.082381 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.085100 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.088224 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.089561 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.093763 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.095419 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.096461 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.103472 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.118557 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.121281 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.123060 4563 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed" exitCode=255 Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.123088 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed"} Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.125683 4563 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.130461 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.130764 4563 scope.go:117] "RemoveContainer" containerID="6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.132771 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.139484 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.148165 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.154174 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.154303 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod 
\"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.154680 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.154813 4563 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.154886 4563 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.154939 4563 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.154995 4563 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155048 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155097 4563 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155101 4563 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155138 4563 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155148 4563 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155159 4563 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155168 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155176 4563 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node 
\"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155185 4563 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155193 4563 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155201 4563 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155209 4563 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155223 4563 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155231 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155241 4563 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 24 
09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155256 4563 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155264 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155272 4563 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155279 4563 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155287 4563 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155295 4563 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155302 4563 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155310 4563 
reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155318 4563 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155326 4563 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155335 4563 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155343 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155350 4563 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155357 4563 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155365 4563 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155374 4563 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155382 4563 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155390 4563 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155397 4563 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155404 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155412 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155419 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: 
\"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155427 4563 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155435 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155443 4563 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155451 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155460 4563 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155468 4563 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155475 4563 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node 
\"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155483 4563 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155490 4563 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155497 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155504 4563 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155512 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155519 4563 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155526 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155533 4563 reconciler_common.go:293] "Volume detached 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155541 4563 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155548 4563 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155556 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155563 4563 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155571 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155596 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155605 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: 
\"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155612 4563 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155620 4563 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155629 4563 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155655 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155663 4563 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155671 4563 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155678 4563 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" 
DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155686 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155694 4563 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155701 4563 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155708 4563 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155717 4563 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155725 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155733 4563 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 
09:04:00.155741 4563 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155749 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155756 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155764 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155773 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155781 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155789 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155797 4563 
reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155804 4563 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155812 4563 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155820 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155826 4563 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155834 4563 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155841 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155851 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155858 4563 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155865 4563 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155872 4563 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155879 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155887 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155894 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155901 4563 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Nov 24 
09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155909 4563 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155916 4563 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155923 4563 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155930 4563 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155938 4563 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155945 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155953 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155961 4563 reconciler_common.go:293] 
"Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155971 4563 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155977 4563 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.155994 4563 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156002 4563 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156010 4563 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156025 4563 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156032 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156041 4563 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156048 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156056 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156063 4563 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156070 4563 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156078 4563 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156085 4563 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc 
kubenswrapper[4563]: I1124 09:04:00.156094 4563 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156101 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156108 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156115 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156122 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156129 4563 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156136 4563 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156143 4563 reconciler_common.go:293] "Volume 
detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156150 4563 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156161 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156168 4563 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156175 4563 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156182 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156189 4563 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156197 4563 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") 
on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156204 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156212 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156219 4563 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156229 4563 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156237 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156245 4563 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156252 4563 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node 
\"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156260 4563 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156267 4563 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156274 4563 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156281 4563 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156289 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156296 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156304 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 
09:04:00.156316 4563 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156324 4563 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156331 4563 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156338 4563 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156346 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156354 4563 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156362 4563 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156369 4563 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156377 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156384 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156391 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156399 4563 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156406 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156413 4563 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156421 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 24 
09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156427 4563 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156435 4563 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156442 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156449 4563 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156456 4563 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156465 4563 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156473 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156481 4563 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156488 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156495 4563 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156503 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156510 4563 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156518 4563 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156525 4563 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156533 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: 
\"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156540 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156547 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156554 4563 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156562 4563 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156570 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156590 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156598 4563 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath 
\"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.156605 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.160004 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.167923 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.173961 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.180405 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.187376 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.196075 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.201672 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.207511 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.325534 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 24 09:04:00 crc kubenswrapper[4563]: W1124 09:04:00.333282 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-1ec8fa253ca790dffc7f940994aa1fe33b2c84e7bbe924317726e4ea9d5ebe91 WatchSource:0}: Error finding container 1ec8fa253ca790dffc7f940994aa1fe33b2c84e7bbe924317726e4ea9d5ebe91: Status 404 returned error can't find the container with id 1ec8fa253ca790dffc7f940994aa1fe33b2c84e7bbe924317726e4ea9d5ebe91 Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.333833 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.341404 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 24 09:04:00 crc kubenswrapper[4563]: W1124 09:04:00.343311 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-2ecd2469e37a9d31f23c7a3f07a180b3836c5030a3307a31d4da08b0b98eb543 WatchSource:0}: Error finding container 2ecd2469e37a9d31f23c7a3f07a180b3836c5030a3307a31d4da08b0b98eb543: Status 404 returned error can't find the container with id 2ecd2469e37a9d31f23c7a3f07a180b3836c5030a3307a31d4da08b0b98eb543 Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.558929 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:04:00 crc kubenswrapper[4563]: E1124 09:04:00.559064 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:04:01.559040456 +0000 UTC m=+18.818017903 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.660392 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.660459 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.660480 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:00 crc kubenswrapper[4563]: I1124 09:04:00.660499 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:00 crc kubenswrapper[4563]: E1124 09:04:00.660564 4563 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 24 09:04:00 crc kubenswrapper[4563]: E1124 09:04:00.660630 4563 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 09:04:00 crc kubenswrapper[4563]: E1124 09:04:00.660660 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 09:04:01.660620084 +0000 UTC m=+18.919597531 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 24 09:04:00 crc kubenswrapper[4563]: E1124 09:04:00.660705 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 09:04:00 crc kubenswrapper[4563]: E1124 09:04:00.660740 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 09:04:00 crc kubenswrapper[4563]: E1124 09:04:00.660752 4563 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:04:00 crc kubenswrapper[4563]: E1124 09:04:00.660758 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 09:04:01.660726234 +0000 UTC m=+18.919703691 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 09:04:00 crc kubenswrapper[4563]: E1124 09:04:00.660796 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 09:04:00 crc kubenswrapper[4563]: E1124 09:04:00.660862 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 09:04:00 crc kubenswrapper[4563]: E1124 09:04:00.660882 4563 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:04:00 crc kubenswrapper[4563]: E1124 09:04:00.660810 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-24 09:04:01.660795876 +0000 UTC m=+18.919773323 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:04:00 crc kubenswrapper[4563]: E1124 09:04:00.660986 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-24 09:04:01.660973429 +0000 UTC m=+18.919950886 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.057467 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.058000 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.058861 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 
09:04:01.059379 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.059917 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.060337 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.060889 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.061398 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.061967 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.062443 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.062957 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 
09:04:01.063554 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.064021 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.064474 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.067013 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.067499 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.068131 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.069298 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.069885 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 
09:04:01.070414 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.070869 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.071707 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.072419 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.073093 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.074822 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.075591 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.077229 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 
09:04:01.077883 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.079296 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.079877 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.080867 4563 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.080997 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.082882 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.083412 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.084300 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" 
path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.085920 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.086698 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.087658 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.088410 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.089487 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.090118 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.091205 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.091889 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.092906 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.093394 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.094444 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.095067 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.096219 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.096770 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.097661 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.098196 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" 
path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.099286 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.099906 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.100404 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.127443 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.129367 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4"} Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.129659 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.130293 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"8b3622caf26d7f666e2f3c357c0422d6ffd65f4a8849ebc2ec21244b76535ab1"} Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.131477 4563 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60"} Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.131545 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293"} Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.131562 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2ecd2469e37a9d31f23c7a3f07a180b3836c5030a3307a31d4da08b0b98eb543"} Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.133677 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956"} Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.133717 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"1ec8fa253ca790dffc7f940994aa1fe33b2c84e7bbe924317726e4ea9d5ebe91"} Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.142086 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41
dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.153296 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.165231 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.175195 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.187424 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.196832 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.205754 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.215606 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 
UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.223839 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.246424 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.251042 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.263832 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.273039 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.277830 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.285339 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacc
ount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.295711 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.304438 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.315414 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.324523 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.333242 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.341931 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.357470 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.369195 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 
UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.378081 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.387509 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.567672 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:04:01 crc kubenswrapper[4563]: E1124 09:04:01.567895 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:04:03.56785716 +0000 UTC m=+20.826834617 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.651986 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.659924 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.661931 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.666331 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.668613 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.668680 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.668711 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.668760 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:01 crc kubenswrapper[4563]: E1124 09:04:01.668846 4563 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 09:04:01 crc kubenswrapper[4563]: E1124 09:04:01.668907 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 09:04:01 crc kubenswrapper[4563]: E1124 09:04:01.668855 4563 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 24 09:04:01 crc kubenswrapper[4563]: E1124 09:04:01.668930 4563 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:04:01 crc kubenswrapper[4563]: E1124 09:04:01.668984 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 09:04:01 crc kubenswrapper[4563]: E1124 09:04:01.669016 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 09:04:01 crc kubenswrapper[4563]: E1124 09:04:01.669033 4563 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:04:01 crc kubenswrapper[4563]: E1124 09:04:01.669001 4563 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 09:04:03.66898093 +0000 UTC m=+20.927958378 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 24 09:04:01 crc kubenswrapper[4563]: E1124 09:04:01.669128 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-24 09:04:03.669119261 +0000 UTC m=+20.928096707 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:04:01 crc kubenswrapper[4563]: E1124 09:04:01.669188 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-24 09:04:03.669149558 +0000 UTC m=+20.928127004 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:04:01 crc kubenswrapper[4563]: E1124 09:04:01.669187 4563 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 09:04:01 crc kubenswrapper[4563]: E1124 09:04:01.669273 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 09:04:03.66924606 +0000 UTC m=+20.928223496 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.679009 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"nam
e\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.690459 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.699411 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.709263 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.722593 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.732147 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 
UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.740467 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.754899 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.766481 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 
UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.775131 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.783373 4563 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.792088 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.802145 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.811277 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.819765 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:01 crc kubenswrapper[4563]: I1124 09:04:01.828437 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.053740 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.053812 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.053896 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:02 crc kubenswrapper[4563]: E1124 09:04:02.054014 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:04:02 crc kubenswrapper[4563]: E1124 09:04:02.054156 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:04:02 crc kubenswrapper[4563]: E1124 09:04:02.054484 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.846444 4563 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.848667 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.848717 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.848728 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.848799 4563 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.854591 4563 kubelet_node_status.go:115] "Node was previously registered" node="crc" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.854894 4563 kubelet_node_status.go:79] "Successfully registered node" node="crc" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.856119 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.856150 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.856159 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.856174 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.856185 4563 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:02Z","lastTransitionTime":"2025-11-24T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:02 crc kubenswrapper[4563]: E1124 09:04:02.870831 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:02Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.873879 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.873931 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.873946 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.873968 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.873982 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:02Z","lastTransitionTime":"2025-11-24T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:02 crc kubenswrapper[4563]: E1124 09:04:02.883960 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:02Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.886568 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.886613 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.886625 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.886653 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.886666 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:02Z","lastTransitionTime":"2025-11-24T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:02 crc kubenswrapper[4563]: E1124 09:04:02.895125 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[…],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:02Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.897476 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.897512 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.897522 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.897534 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.897543 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:02Z","lastTransitionTime":"2025-11-24T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:02 crc kubenswrapper[4563]: E1124 09:04:02.907045 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[…],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:02Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.909478 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.909509 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.909518 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.909533 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.909545 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:02Z","lastTransitionTime":"2025-11-24T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:02 crc kubenswrapper[4563]: E1124 09:04:02.922333 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:02Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:02 crc kubenswrapper[4563]: E1124 09:04:02.922440 4563 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.923446 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.923473 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.923484 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.923497 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:02 crc kubenswrapper[4563]: I1124 09:04:02.923507 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:02Z","lastTransitionTime":"2025-11-24T09:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.026016 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.026061 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.026071 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.026088 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.026099 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:03Z","lastTransitionTime":"2025-11-24T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.065207 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.075701 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.085233 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.095194 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.107432 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.118911 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.127413 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.127471 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 
09:04:03.127484 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.127503 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.127516 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:03Z","lastTransitionTime":"2025-11-24T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.136164 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"re
source-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\
\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.138618 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49"} Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.146718 4563 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 
configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\
\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.156086 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.165329 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.174049 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.183842 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.193021 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.201986 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.210616 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-24T09:04:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.225038 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.229591 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.229632 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.229667 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.229687 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.229703 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:03Z","lastTransitionTime":"2025-11-24T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.235519 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 
UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.244634 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.335124 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.335176 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.335384 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.335404 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.335416 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:03Z","lastTransitionTime":"2025-11-24T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.438280 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.438322 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.438333 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.438346 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.438356 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:03Z","lastTransitionTime":"2025-11-24T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.541294 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.541347 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.541359 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.541376 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.541385 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:03Z","lastTransitionTime":"2025-11-24T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.583561 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:04:03 crc kubenswrapper[4563]: E1124 09:04:03.583674 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-24 09:04:07.583613914 +0000 UTC m=+24.842591360 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.643355 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.643403 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.643418 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.643438 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.643451 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:03Z","lastTransitionTime":"2025-11-24T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.684902 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.684955 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.684984 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.685017 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:03 crc kubenswrapper[4563]: E1124 09:04:03.685115 4563 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Nov 24 09:04:03 crc kubenswrapper[4563]: E1124 09:04:03.685174 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 09:04:03 crc kubenswrapper[4563]: E1124 09:04:03.685197 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 09:04:03 crc kubenswrapper[4563]: E1124 09:04:03.685224 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 09:04:03 crc kubenswrapper[4563]: E1124 09:04:03.685240 4563 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:04:03 crc kubenswrapper[4563]: E1124 09:04:03.685182 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 09:04:07.685168265 +0000 UTC m=+24.944145713 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 24 09:04:03 crc kubenswrapper[4563]: E1124 09:04:03.685206 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 09:04:03 crc kubenswrapper[4563]: E1124 09:04:03.685317 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-24 09:04:07.68529801 +0000 UTC m=+24.944275457 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:04:03 crc kubenswrapper[4563]: E1124 09:04:03.685325 4563 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:04:03 crc kubenswrapper[4563]: E1124 09:04:03.685113 4563 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 09:04:03 crc 
kubenswrapper[4563]: E1124 09:04:03.685365 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 09:04:07.685358573 +0000 UTC m=+24.944336021 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 09:04:03 crc kubenswrapper[4563]: E1124 09:04:03.685399 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-24 09:04:07.685376799 +0000 UTC m=+24.944354245 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.745373 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.745402 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.745416 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.745432 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.745442 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:03Z","lastTransitionTime":"2025-11-24T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.847671 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.847707 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.847718 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.847733 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.847742 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:03Z","lastTransitionTime":"2025-11-24T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.949627 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.949688 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.949697 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.949714 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:03 crc kubenswrapper[4563]: I1124 09:04:03.949723 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:03Z","lastTransitionTime":"2025-11-24T09:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.051339 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.051383 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.051392 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.051406 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.051417 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:04Z","lastTransitionTime":"2025-11-24T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.054594 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.054657 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.054704 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:04 crc kubenswrapper[4563]: E1124 09:04:04.054810 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:04:04 crc kubenswrapper[4563]: E1124 09:04:04.054881 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:04:04 crc kubenswrapper[4563]: E1124 09:04:04.054950 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.153322 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.153353 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.153364 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.153402 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.153418 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:04Z","lastTransitionTime":"2025-11-24T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.255909 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.255957 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.255967 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.255982 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.255991 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:04Z","lastTransitionTime":"2025-11-24T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.357614 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.357688 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.357700 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.357716 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.357732 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:04Z","lastTransitionTime":"2025-11-24T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.460486 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.460532 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.460540 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.460555 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.460564 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:04Z","lastTransitionTime":"2025-11-24T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.562914 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.562994 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.563005 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.563032 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.563044 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:04Z","lastTransitionTime":"2025-11-24T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.665566 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.665623 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.665633 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.665670 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.665680 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:04Z","lastTransitionTime":"2025-11-24T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.767626 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.767674 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.767684 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.767697 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.767708 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:04Z","lastTransitionTime":"2025-11-24T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.869875 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.869961 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.869973 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.870001 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.870016 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:04Z","lastTransitionTime":"2025-11-24T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.971946 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.972000 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.972010 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.972022 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:04 crc kubenswrapper[4563]: I1124 09:04:04.972031 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:04Z","lastTransitionTime":"2025-11-24T09:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:05 crc kubenswrapper[4563]: I1124 09:04:05.073694 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:05 crc kubenswrapper[4563]: I1124 09:04:05.073739 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:05 crc kubenswrapper[4563]: I1124 09:04:05.073748 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:05 crc kubenswrapper[4563]: I1124 09:04:05.073761 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:05 crc kubenswrapper[4563]: I1124 09:04:05.073769 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:05Z","lastTransitionTime":"2025-11-24T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:05 crc kubenswrapper[4563]: I1124 09:04:05.891924 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:05 crc kubenswrapper[4563]: I1124 09:04:05.891955 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:05 crc kubenswrapper[4563]: I1124 09:04:05.891964 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:05 crc kubenswrapper[4563]: I1124 09:04:05.891976 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:05 crc kubenswrapper[4563]: I1124 09:04:05.891986 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:05Z","lastTransitionTime":"2025-11-24T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:05 crc kubenswrapper[4563]: I1124 09:04:05.978820 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-7jjh2"] Nov 24 09:04:05 crc kubenswrapper[4563]: I1124 09:04:05.979219 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-7jjh2" Nov 24 09:04:05 crc kubenswrapper[4563]: I1124 09:04:05.980886 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 24 09:04:05 crc kubenswrapper[4563]: I1124 09:04:05.981207 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 24 09:04:05 crc kubenswrapper[4563]: I1124 09:04:05.981364 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 24 09:04:05 crc kubenswrapper[4563]: I1124 09:04:05.989947 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:05Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:05 crc kubenswrapper[4563]: I1124 09:04:05.993656 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:05 crc kubenswrapper[4563]: I1124 09:04:05.993682 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:05 crc kubenswrapper[4563]: I1124 09:04:05.993693 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:05 crc kubenswrapper[4563]: I1124 09:04:05.993707 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:05 crc kubenswrapper[4563]: I1124 09:04:05.993717 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:05Z","lastTransitionTime":"2025-11-24T09:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:05 crc kubenswrapper[4563]: I1124 09:04:05.998974 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://
be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:05Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.007293 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.016889 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.023270 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.035304 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.043775 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 
UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.051885 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.053591 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.053659 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.053595 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:06 crc kubenswrapper[4563]: E1124 09:04:06.053692 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:04:06 crc kubenswrapper[4563]: E1124 09:04:06.053776 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:04:06 crc kubenswrapper[4563]: E1124 09:04:06.053841 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.060452 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.067326 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.095700 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.095731 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.095740 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.095752 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.095761 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:06Z","lastTransitionTime":"2025-11-24T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.106029 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/223c299f-bbf0-4b77-9792-045c08cbfb0d-hosts-file\") pod \"node-resolver-7jjh2\" (UID: \"223c299f-bbf0-4b77-9792-045c08cbfb0d\") " pod="openshift-dns/node-resolver-7jjh2" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.106073 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6clq\" (UniqueName: \"kubernetes.io/projected/223c299f-bbf0-4b77-9792-045c08cbfb0d-kube-api-access-p6clq\") pod \"node-resolver-7jjh2\" (UID: \"223c299f-bbf0-4b77-9792-045c08cbfb0d\") " pod="openshift-dns/node-resolver-7jjh2" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.197225 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.197258 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.197266 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.197278 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.197295 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:06Z","lastTransitionTime":"2025-11-24T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.206848 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6clq\" (UniqueName: \"kubernetes.io/projected/223c299f-bbf0-4b77-9792-045c08cbfb0d-kube-api-access-p6clq\") pod \"node-resolver-7jjh2\" (UID: \"223c299f-bbf0-4b77-9792-045c08cbfb0d\") " pod="openshift-dns/node-resolver-7jjh2" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.206892 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/223c299f-bbf0-4b77-9792-045c08cbfb0d-hosts-file\") pod \"node-resolver-7jjh2\" (UID: \"223c299f-bbf0-4b77-9792-045c08cbfb0d\") " pod="openshift-dns/node-resolver-7jjh2" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.206971 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/223c299f-bbf0-4b77-9792-045c08cbfb0d-hosts-file\") pod \"node-resolver-7jjh2\" (UID: \"223c299f-bbf0-4b77-9792-045c08cbfb0d\") " pod="openshift-dns/node-resolver-7jjh2" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.220949 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6clq\" (UniqueName: \"kubernetes.io/projected/223c299f-bbf0-4b77-9792-045c08cbfb0d-kube-api-access-p6clq\") pod \"node-resolver-7jjh2\" (UID: \"223c299f-bbf0-4b77-9792-045c08cbfb0d\") " pod="openshift-dns/node-resolver-7jjh2" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.287944 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-7jjh2" Nov 24 09:04:06 crc kubenswrapper[4563]: W1124 09:04:06.296160 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod223c299f_bbf0_4b77_9792_045c08cbfb0d.slice/crio-53758b3d872053eb13d6c0b138c8326e205538e64861baaf72dd44bab2004926 WatchSource:0}: Error finding container 53758b3d872053eb13d6c0b138c8326e205538e64861baaf72dd44bab2004926: Status 404 returned error can't find the container with id 53758b3d872053eb13d6c0b138c8326e205538e64861baaf72dd44bab2004926 Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.298476 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.298495 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.298503 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.298515 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.298524 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:06Z","lastTransitionTime":"2025-11-24T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.340039 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-stlxr"] Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.340432 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-nw8xd"] Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.340603 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.340608 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.341888 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-7qphz"] Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.342338 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7qphz" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.343156 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.343260 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.343317 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.343343 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.343474 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.344419 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.344442 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.344520 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.344554 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.344615 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.345283 4563 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.345372 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.360932 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/mani
fests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4
c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646f
b68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.369542 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.378819 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.387512 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.394023 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.400605 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.400629 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.400652 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.400665 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.400673 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:06Z","lastTransitionTime":"2025-11-24T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.402972 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.408824 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-host-var-lib-cni-bin\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.408852 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-cnibin\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.408866 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-os-release\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.408880 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3b2bfe55-8989-49b3-bb61-e28189447627-proxy-tls\") pod \"machine-config-daemon-stlxr\" (UID: \"3b2bfe55-8989-49b3-bb61-e28189447627\") " pod="openshift-machine-config-operator/machine-config-daemon-stlxr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.408894 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/019bd805-9123-494a-bb29-f39b924e6243-cni-binary-copy\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.408917 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-host-run-netns\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.408931 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-host-var-lib-kubelet\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.408946 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/019bd805-9123-494a-bb29-f39b924e6243-multus-daemon-config\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.408977 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-host-run-multus-certs\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.409001 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/adf08273-4b03-4e6f-8e52-d968b8c98f99-cni-binary-copy\") pod \"multus-additional-cni-plugins-7qphz\" (UID: \"adf08273-4b03-4e6f-8e52-d968b8c98f99\") " pod="openshift-multus/multus-additional-cni-plugins-7qphz" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.409036 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np6f7\" (UniqueName: \"kubernetes.io/projected/adf08273-4b03-4e6f-8e52-d968b8c98f99-kube-api-access-np6f7\") pod \"multus-additional-cni-plugins-7qphz\" (UID: \"adf08273-4b03-4e6f-8e52-d968b8c98f99\") " pod="openshift-multus/multus-additional-cni-plugins-7qphz" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.409081 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-system-cni-dir\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.409105 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-etc-kubernetes\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.409127 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/adf08273-4b03-4e6f-8e52-d968b8c98f99-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7qphz\" (UID: \"adf08273-4b03-4e6f-8e52-d968b8c98f99\") " pod="openshift-multus/multus-additional-cni-plugins-7qphz" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.409151 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm8wf\" (UniqueName: \"kubernetes.io/projected/3b2bfe55-8989-49b3-bb61-e28189447627-kube-api-access-nm8wf\") pod \"machine-config-daemon-stlxr\" (UID: \"3b2bfe55-8989-49b3-bb61-e28189447627\") " pod="openshift-machine-config-operator/machine-config-daemon-stlxr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.409195 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/adf08273-4b03-4e6f-8e52-d968b8c98f99-cnibin\") pod \"multus-additional-cni-plugins-7qphz\" (UID: \"adf08273-4b03-4e6f-8e52-d968b8c98f99\") " pod="openshift-multus/multus-additional-cni-plugins-7qphz" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.409239 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3b2bfe55-8989-49b3-bb61-e28189447627-mcd-auth-proxy-config\") pod \"machine-config-daemon-stlxr\" (UID: \"3b2bfe55-8989-49b3-bb61-e28189447627\") " pod="openshift-machine-config-operator/machine-config-daemon-stlxr" Nov 24 09:04:06 crc 
kubenswrapper[4563]: I1124 09:04:06.409259 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3b2bfe55-8989-49b3-bb61-e28189447627-rootfs\") pod \"machine-config-daemon-stlxr\" (UID: \"3b2bfe55-8989-49b3-bb61-e28189447627\") " pod="openshift-machine-config-operator/machine-config-daemon-stlxr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.409276 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-multus-cni-dir\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.409289 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-hostroot\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.409303 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-multus-conf-dir\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.409323 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/adf08273-4b03-4e6f-8e52-d968b8c98f99-system-cni-dir\") pod \"multus-additional-cni-plugins-7qphz\" (UID: \"adf08273-4b03-4e6f-8e52-d968b8c98f99\") " pod="openshift-multus/multus-additional-cni-plugins-7qphz" Nov 24 09:04:06 crc 
kubenswrapper[4563]: I1124 09:04:06.409343 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8nv5\" (UniqueName: \"kubernetes.io/projected/019bd805-9123-494a-bb29-f39b924e6243-kube-api-access-f8nv5\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.409358 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-host-run-k8s-cni-cncf-io\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.409372 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/adf08273-4b03-4e6f-8e52-d968b8c98f99-os-release\") pod \"multus-additional-cni-plugins-7qphz\" (UID: \"adf08273-4b03-4e6f-8e52-d968b8c98f99\") " pod="openshift-multus/multus-additional-cni-plugins-7qphz" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.409386 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/adf08273-4b03-4e6f-8e52-d968b8c98f99-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7qphz\" (UID: \"adf08273-4b03-4e6f-8e52-d968b8c98f99\") " pod="openshift-multus/multus-additional-cni-plugins-7qphz" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.409402 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-multus-socket-dir-parent\") pod \"multus-nw8xd\" (UID: 
\"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.409416 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-host-var-lib-cni-multus\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.410410 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.419869 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 
UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.427710 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.436052 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.443448 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.451183 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.460115 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.468078 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.474306 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.482448 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.489846 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.498447 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 
UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.502896 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.502925 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.502933 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.502948 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.502957 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:06Z","lastTransitionTime":"2025-11-24T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.505770 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.510681 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-etc-kubernetes\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.510707 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/adf08273-4b03-4e6f-8e52-d968b8c98f99-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7qphz\" (UID: \"adf08273-4b03-4e6f-8e52-d968b8c98f99\") " pod="openshift-multus/multus-additional-cni-plugins-7qphz" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.510723 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm8wf\" (UniqueName: \"kubernetes.io/projected/3b2bfe55-8989-49b3-bb61-e28189447627-kube-api-access-nm8wf\") pod \"machine-config-daemon-stlxr\" (UID: \"3b2bfe55-8989-49b3-bb61-e28189447627\") " pod="openshift-machine-config-operator/machine-config-daemon-stlxr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.510742 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/adf08273-4b03-4e6f-8e52-d968b8c98f99-cnibin\") pod \"multus-additional-cni-plugins-7qphz\" (UID: \"adf08273-4b03-4e6f-8e52-d968b8c98f99\") " pod="openshift-multus/multus-additional-cni-plugins-7qphz" Nov 24 09:04:06 crc 
kubenswrapper[4563]: I1124 09:04:06.510755 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3b2bfe55-8989-49b3-bb61-e28189447627-mcd-auth-proxy-config\") pod \"machine-config-daemon-stlxr\" (UID: \"3b2bfe55-8989-49b3-bb61-e28189447627\") " pod="openshift-machine-config-operator/machine-config-daemon-stlxr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.510769 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-multus-cni-dir\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.510782 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-hostroot\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.510796 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3b2bfe55-8989-49b3-bb61-e28189447627-rootfs\") pod \"machine-config-daemon-stlxr\" (UID: \"3b2bfe55-8989-49b3-bb61-e28189447627\") " pod="openshift-machine-config-operator/machine-config-daemon-stlxr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.510809 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-multus-conf-dir\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.510822 4563 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/adf08273-4b03-4e6f-8e52-d968b8c98f99-system-cni-dir\") pod \"multus-additional-cni-plugins-7qphz\" (UID: \"adf08273-4b03-4e6f-8e52-d968b8c98f99\") " pod="openshift-multus/multus-additional-cni-plugins-7qphz" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.510835 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8nv5\" (UniqueName: \"kubernetes.io/projected/019bd805-9123-494a-bb29-f39b924e6243-kube-api-access-f8nv5\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.510849 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-host-run-k8s-cni-cncf-io\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.510862 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/adf08273-4b03-4e6f-8e52-d968b8c98f99-os-release\") pod \"multus-additional-cni-plugins-7qphz\" (UID: \"adf08273-4b03-4e6f-8e52-d968b8c98f99\") " pod="openshift-multus/multus-additional-cni-plugins-7qphz" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.510875 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/adf08273-4b03-4e6f-8e52-d968b8c98f99-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7qphz\" (UID: \"adf08273-4b03-4e6f-8e52-d968b8c98f99\") " pod="openshift-multus/multus-additional-cni-plugins-7qphz" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.510889 4563 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-multus-socket-dir-parent\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.510902 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-host-var-lib-cni-multus\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.510922 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-host-var-lib-cni-bin\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.510935 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-cnibin\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.510946 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-os-release\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.510959 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3b2bfe55-8989-49b3-bb61-e28189447627-proxy-tls\") pod 
\"machine-config-daemon-stlxr\" (UID: \"3b2bfe55-8989-49b3-bb61-e28189447627\") " pod="openshift-machine-config-operator/machine-config-daemon-stlxr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.510994 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/019bd805-9123-494a-bb29-f39b924e6243-cni-binary-copy\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.511013 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-host-run-netns\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.511026 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-host-var-lib-kubelet\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.511040 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/019bd805-9123-494a-bb29-f39b924e6243-multus-daemon-config\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.511053 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-host-run-multus-certs\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " 
pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.511065 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/adf08273-4b03-4e6f-8e52-d968b8c98f99-cni-binary-copy\") pod \"multus-additional-cni-plugins-7qphz\" (UID: \"adf08273-4b03-4e6f-8e52-d968b8c98f99\") " pod="openshift-multus/multus-additional-cni-plugins-7qphz" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.511078 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np6f7\" (UniqueName: \"kubernetes.io/projected/adf08273-4b03-4e6f-8e52-d968b8c98f99-kube-api-access-np6f7\") pod \"multus-additional-cni-plugins-7qphz\" (UID: \"adf08273-4b03-4e6f-8e52-d968b8c98f99\") " pod="openshift-multus/multus-additional-cni-plugins-7qphz" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.511097 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-system-cni-dir\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.511150 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-system-cni-dir\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.511176 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-etc-kubernetes\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 
09:04:06.511265 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-multus-socket-dir-parent\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.511288 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-multus-conf-dir\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.511292 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3b2bfe55-8989-49b3-bb61-e28189447627-rootfs\") pod \"machine-config-daemon-stlxr\" (UID: \"3b2bfe55-8989-49b3-bb61-e28189447627\") " pod="openshift-machine-config-operator/machine-config-daemon-stlxr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.511307 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/adf08273-4b03-4e6f-8e52-d968b8c98f99-system-cni-dir\") pod \"multus-additional-cni-plugins-7qphz\" (UID: \"adf08273-4b03-4e6f-8e52-d968b8c98f99\") " pod="openshift-multus/multus-additional-cni-plugins-7qphz" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.511334 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-host-run-netns\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.511376 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-host-var-lib-kubelet\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.511416 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/adf08273-4b03-4e6f-8e52-d968b8c98f99-cnibin\") pod \"multus-additional-cni-plugins-7qphz\" (UID: \"adf08273-4b03-4e6f-8e52-d968b8c98f99\") " pod="openshift-multus/multus-additional-cni-plugins-7qphz" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.511487 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-host-run-k8s-cni-cncf-io\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.511526 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/adf08273-4b03-4e6f-8e52-d968b8c98f99-os-release\") pod \"multus-additional-cni-plugins-7qphz\" (UID: \"adf08273-4b03-4e6f-8e52-d968b8c98f99\") " pod="openshift-multus/multus-additional-cni-plugins-7qphz" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.511760 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-host-run-multus-certs\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.511830 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-cnibin\") pod \"multus-nw8xd\" (UID: 
\"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.511826 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-hostroot\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.511850 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-host-var-lib-cni-multus\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.511858 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-host-var-lib-cni-bin\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.511937 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-multus-cni-dir\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.511995 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/019bd805-9123-494a-bb29-f39b924e6243-os-release\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.512226 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/019bd805-9123-494a-bb29-f39b924e6243-cni-binary-copy\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.512275 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/019bd805-9123-494a-bb29-f39b924e6243-multus-daemon-config\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.512287 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/adf08273-4b03-4e6f-8e52-d968b8c98f99-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7qphz\" (UID: \"adf08273-4b03-4e6f-8e52-d968b8c98f99\") " pod="openshift-multus/multus-additional-cni-plugins-7qphz" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.512356 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/adf08273-4b03-4e6f-8e52-d968b8c98f99-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7qphz\" (UID: \"adf08273-4b03-4e6f-8e52-d968b8c98f99\") " pod="openshift-multus/multus-additional-cni-plugins-7qphz" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.512582 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3b2bfe55-8989-49b3-bb61-e28189447627-mcd-auth-proxy-config\") pod \"machine-config-daemon-stlxr\" (UID: \"3b2bfe55-8989-49b3-bb61-e28189447627\") " pod="openshift-machine-config-operator/machine-config-daemon-stlxr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.512685 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/adf08273-4b03-4e6f-8e52-d968b8c98f99-cni-binary-copy\") pod \"multus-additional-cni-plugins-7qphz\" (UID: \"adf08273-4b03-4e6f-8e52-d968b8c98f99\") " pod="openshift-multus/multus-additional-cni-plugins-7qphz" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.514097 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\
"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.515253 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3b2bfe55-8989-49b3-bb61-e28189447627-proxy-tls\") pod \"machine-config-daemon-stlxr\" (UID: \"3b2bfe55-8989-49b3-bb61-e28189447627\") " pod="openshift-machine-config-operator/machine-config-daemon-stlxr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.521499 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.523823 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8nv5\" (UniqueName: \"kubernetes.io/projected/019bd805-9123-494a-bb29-f39b924e6243-kube-api-access-f8nv5\") pod \"multus-nw8xd\" (UID: \"019bd805-9123-494a-bb29-f39b924e6243\") " pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.524386 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np6f7\" (UniqueName: \"kubernetes.io/projected/adf08273-4b03-4e6f-8e52-d968b8c98f99-kube-api-access-np6f7\") pod \"multus-additional-cni-plugins-7qphz\" (UID: \"adf08273-4b03-4e6f-8e52-d968b8c98f99\") " pod="openshift-multus/multus-additional-cni-plugins-7qphz" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.525334 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nm8wf\" (UniqueName: \"kubernetes.io/projected/3b2bfe55-8989-49b3-bb61-e28189447627-kube-api-access-nm8wf\") pod \"machine-config-daemon-stlxr\" (UID: \"3b2bfe55-8989-49b3-bb61-e28189447627\") " pod="openshift-machine-config-operator/machine-config-daemon-stlxr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.529103 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.541781 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.550880 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:
44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.559917 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.604177 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.604209 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.604219 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.604234 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.604244 4563 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:06Z","lastTransitionTime":"2025-11-24T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.653293 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nw8xd" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.657001 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.661718 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7qphz" Nov 24 09:04:06 crc kubenswrapper[4563]: W1124 09:04:06.662468 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod019bd805_9123_494a_bb29_f39b924e6243.slice/crio-a80683820b75f6b6c859a7d6fc1e7a2e0a0e80bef195528d717647e65e6ab589 WatchSource:0}: Error finding container a80683820b75f6b6c859a7d6fc1e7a2e0a0e80bef195528d717647e65e6ab589: Status 404 returned error can't find the container with id a80683820b75f6b6c859a7d6fc1e7a2e0a0e80bef195528d717647e65e6ab589 Nov 24 09:04:06 crc kubenswrapper[4563]: W1124 09:04:06.668959 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b2bfe55_8989_49b3_bb61_e28189447627.slice/crio-3abc033214cb59be725fc1c51fe047bbe11762ccb48561e5b230e0eac3b71d1f WatchSource:0}: Error finding container 3abc033214cb59be725fc1c51fe047bbe11762ccb48561e5b230e0eac3b71d1f: Status 404 returned error can't find the container with id 
3abc033214cb59be725fc1c51fe047bbe11762ccb48561e5b230e0eac3b71d1f Nov 24 09:04:06 crc kubenswrapper[4563]: W1124 09:04:06.672460 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadf08273_4b03_4e6f_8e52_d968b8c98f99.slice/crio-82e3db34e3550f4a1b84128882c88576caa3ea49b6f135e58d4de81858764f3e WatchSource:0}: Error finding container 82e3db34e3550f4a1b84128882c88576caa3ea49b6f135e58d4de81858764f3e: Status 404 returned error can't find the container with id 82e3db34e3550f4a1b84128882c88576caa3ea49b6f135e58d4de81858764f3e Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.694696 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vgbgr"] Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.695296 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.696725 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.697022 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.697224 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.697648 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.697657 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.697889 4563 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.699095 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.706564 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.706623 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.706632 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.706672 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.706682 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:06Z","lastTransitionTime":"2025-11-24T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.706967 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.715536 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.724702 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.746608 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.775168 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.787398 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:
44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.795967 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.803702 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.808491 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.808528 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.808538 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.808550 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.808559 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:06Z","lastTransitionTime":"2025-11-24T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.812760 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.812875 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-run-netns\") pod \"ovnkube-node-vgbgr\" (UID: 
\"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.812899 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-run-ovn\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.812917 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cee9b713-10b0-49a5-841d-fbb083faba9a-ovnkube-script-lib\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.812932 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-kubelet\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.812962 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-etc-openvswitch\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.813020 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-node-log\") pod \"ovnkube-node-vgbgr\" (UID: 
\"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.813049 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cee9b713-10b0-49a5-841d-fbb083faba9a-ovnkube-config\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.813076 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-cni-bin\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.813091 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-run-openvswitch\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.813106 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d62m\" (UniqueName: \"kubernetes.io/projected/cee9b713-10b0-49a5-841d-fbb083faba9a-kube-api-access-5d62m\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.813133 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-cni-netd\") pod 
\"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.813154 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-systemd-units\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.813166 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-slash\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.813178 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-var-lib-openvswitch\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.813195 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-run-ovn-kubernetes\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.813234 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/cee9b713-10b0-49a5-841d-fbb083faba9a-env-overrides\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.813267 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-run-systemd\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.813282 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-log-socket\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.813300 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.813318 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cee9b713-10b0-49a5-841d-fbb083faba9a-ovn-node-metrics-cert\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.822429 4563 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.831830 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.841037 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.852488 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 
UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.861194 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:06Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.910938 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.910973 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.910982 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.910996 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.911004 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:06Z","lastTransitionTime":"2025-11-24T09:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914276 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-cni-bin\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914305 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-run-openvswitch\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914339 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d62m\" (UniqueName: \"kubernetes.io/projected/cee9b713-10b0-49a5-841d-fbb083faba9a-kube-api-access-5d62m\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914362 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-cni-netd\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914384 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-systemd-units\") pod 
\"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914418 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-slash\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914434 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-var-lib-openvswitch\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914449 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-run-ovn-kubernetes\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914459 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-slash\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914425 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-cni-bin\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914485 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cee9b713-10b0-49a5-841d-fbb083faba9a-env-overrides\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914502 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-run-systemd\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914510 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-run-ovn-kubernetes\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914502 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-cni-netd\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914530 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-systemd-units\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 
09:04:06.914517 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-log-socket\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914564 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-log-socket\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914586 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-run-systemd\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914630 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914543 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-var-lib-openvswitch\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914672 4563 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-run-openvswitch\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914681 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cee9b713-10b0-49a5-841d-fbb083faba9a-ovn-node-metrics-cert\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914680 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914757 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-run-netns\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914776 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-run-ovn\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914790 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cee9b713-10b0-49a5-841d-fbb083faba9a-ovnkube-script-lib\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914822 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-kubelet\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914838 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-etc-openvswitch\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914848 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-run-ovn\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914855 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-node-log\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914887 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-run-netns\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914890 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-kubelet\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914907 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-etc-openvswitch\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914909 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cee9b713-10b0-49a5-841d-fbb083faba9a-ovnkube-config\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.914894 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-node-log\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.915075 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cee9b713-10b0-49a5-841d-fbb083faba9a-env-overrides\") pod \"ovnkube-node-vgbgr\" (UID: 
\"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.915322 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cee9b713-10b0-49a5-841d-fbb083faba9a-ovnkube-script-lib\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.915419 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cee9b713-10b0-49a5-841d-fbb083faba9a-ovnkube-config\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.917596 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cee9b713-10b0-49a5-841d-fbb083faba9a-ovn-node-metrics-cert\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:06 crc kubenswrapper[4563]: I1124 09:04:06.926542 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d62m\" (UniqueName: \"kubernetes.io/projected/cee9b713-10b0-49a5-841d-fbb083faba9a-kube-api-access-5d62m\") pod \"ovnkube-node-vgbgr\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.012272 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.013732 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.013770 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.013779 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.013792 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.013800 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:07Z","lastTransitionTime":"2025-11-24T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:07 crc kubenswrapper[4563]: W1124 09:04:07.020519 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcee9b713_10b0_49a5_841d_fbb083faba9a.slice/crio-8f586bd3089d03400821f573ab0381f0bcba9faa5e1d46557ce28d5e286c3f3d WatchSource:0}: Error finding container 8f586bd3089d03400821f573ab0381f0bcba9faa5e1d46557ce28d5e286c3f3d: Status 404 returned error can't find the container with id 8f586bd3089d03400821f573ab0381f0bcba9faa5e1d46557ce28d5e286c3f3d Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.115666 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.115829 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.115837 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.115850 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.115858 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:07Z","lastTransitionTime":"2025-11-24T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.148142 4563 generic.go:334] "Generic (PLEG): container finished" podID="adf08273-4b03-4e6f-8e52-d968b8c98f99" containerID="460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175" exitCode=0 Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.148204 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" event={"ID":"adf08273-4b03-4e6f-8e52-d968b8c98f99","Type":"ContainerDied","Data":"460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175"} Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.148228 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" event={"ID":"adf08273-4b03-4e6f-8e52-d968b8c98f99","Type":"ContainerStarted","Data":"82e3db34e3550f4a1b84128882c88576caa3ea49b6f135e58d4de81858764f3e"} Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.149209 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7jjh2" event={"ID":"223c299f-bbf0-4b77-9792-045c08cbfb0d","Type":"ContainerStarted","Data":"6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22"} Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.149269 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7jjh2" event={"ID":"223c299f-bbf0-4b77-9792-045c08cbfb0d","Type":"ContainerStarted","Data":"53758b3d872053eb13d6c0b138c8326e205538e64861baaf72dd44bab2004926"} Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.150722 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nw8xd" event={"ID":"019bd805-9123-494a-bb29-f39b924e6243","Type":"ContainerStarted","Data":"6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b"} Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.150764 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-nw8xd" event={"ID":"019bd805-9123-494a-bb29-f39b924e6243","Type":"ContainerStarted","Data":"a80683820b75f6b6c859a7d6fc1e7a2e0a0e80bef195528d717647e65e6ab589"} Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.151878 4563 generic.go:334] "Generic (PLEG): container finished" podID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerID="5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2" exitCode=0 Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.151934 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" event={"ID":"cee9b713-10b0-49a5-841d-fbb083faba9a","Type":"ContainerDied","Data":"5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2"} Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.151950 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" event={"ID":"cee9b713-10b0-49a5-841d-fbb083faba9a","Type":"ContainerStarted","Data":"8f586bd3089d03400821f573ab0381f0bcba9faa5e1d46557ce28d5e286c3f3d"} Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.153423 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" event={"ID":"3b2bfe55-8989-49b3-bb61-e28189447627","Type":"ContainerStarted","Data":"bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4"} Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.153448 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" event={"ID":"3b2bfe55-8989-49b3-bb61-e28189447627","Type":"ContainerStarted","Data":"6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd"} Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.153457 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" 
event={"ID":"3b2bfe55-8989-49b3-bb61-e28189447627","Type":"ContainerStarted","Data":"3abc033214cb59be725fc1c51fe047bbe11762ccb48561e5b230e0eac3b71d1f"} Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.164053 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-
dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723
269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exit
Code\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:07Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.174461 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:07Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.191447 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:07Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.200246 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:07Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.206584 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:07Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.214694 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:07Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.218493 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.218524 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.218534 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.218546 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.218584 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:07Z","lastTransitionTime":"2025-11-24T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.222374 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:07Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.235907 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:07Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.245804 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" 
certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 
09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedA
t\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:07Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.255010 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55
b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:07Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.264444 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:07Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.273162 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:07Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.281707 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:07Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.293774 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:07Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.300627 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:07Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.308545 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:07Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.315619 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:07Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.320185 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.320211 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.320219 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.320233 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.320242 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:07Z","lastTransitionTime":"2025-11-24T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.325079 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:07Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.333015 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:07Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.340178 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-24T09:04:07Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.349182 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 
UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:07Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.367906 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:07Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.406239 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:07Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.422232 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.422255 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.422263 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.422275 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.422283 4563 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:07Z","lastTransitionTime":"2025-11-24T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.447669 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:07Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.490266 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:07Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.523807 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.523841 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.523849 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.523862 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.523873 4563 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:07Z","lastTransitionTime":"2025-11-24T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.526925 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:07Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.567517 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:07Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.610552 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:07Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.620144 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:04:07 crc kubenswrapper[4563]: E1124 09:04:07.620271 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:04:15.620234882 +0000 UTC m=+32.879212329 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.625520 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.625550 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.625592 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.625606 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.625614 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:07Z","lastTransitionTime":"2025-11-24T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.720860 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.721030 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.721049 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.721066 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:07 crc kubenswrapper[4563]: E1124 09:04:07.720998 4563 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Nov 24 09:04:07 crc kubenswrapper[4563]: E1124 09:04:07.721184 4563 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 09:04:07 crc kubenswrapper[4563]: E1124 09:04:07.721153 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 09:04:07 crc kubenswrapper[4563]: E1124 09:04:07.721215 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 09:04:07 crc kubenswrapper[4563]: E1124 09:04:07.721228 4563 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:04:07 crc kubenswrapper[4563]: E1124 09:04:07.721160 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 09:04:07 crc kubenswrapper[4563]: E1124 09:04:07.721260 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 09:04:07 crc kubenswrapper[4563]: E1124 09:04:07.721267 4563 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:04:07 crc kubenswrapper[4563]: E1124 09:04:07.721217 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 09:04:15.721206526 +0000 UTC m=+32.980183973 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 09:04:07 crc kubenswrapper[4563]: E1124 09:04:07.721301 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 09:04:15.72128843 +0000 UTC m=+32.980265877 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 24 09:04:07 crc kubenswrapper[4563]: E1124 09:04:07.721311 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-24 09:04:15.721305623 +0000 UTC m=+32.980283069 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:04:07 crc kubenswrapper[4563]: E1124 09:04:07.721320 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-24 09:04:15.721316512 +0000 UTC m=+32.980293960 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.728850 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.728881 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.728891 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.728910 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.728919 4563 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:07Z","lastTransitionTime":"2025-11-24T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.830434 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.830472 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.830481 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.830494 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.830503 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:07Z","lastTransitionTime":"2025-11-24T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.932384 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.932415 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.932423 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.932435 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:07 crc kubenswrapper[4563]: I1124 09:04:07.932443 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:07Z","lastTransitionTime":"2025-11-24T09:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.034004 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.034035 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.034044 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.034055 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.034064 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:08Z","lastTransitionTime":"2025-11-24T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.054329 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.054367 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.054390 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:08 crc kubenswrapper[4563]: E1124 09:04:08.054416 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:04:08 crc kubenswrapper[4563]: E1124 09:04:08.054473 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:04:08 crc kubenswrapper[4563]: E1124 09:04:08.054526 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.135460 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.135588 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.135680 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.135740 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.135797 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:08Z","lastTransitionTime":"2025-11-24T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.156726 4563 generic.go:334] "Generic (PLEG): container finished" podID="adf08273-4b03-4e6f-8e52-d968b8c98f99" containerID="9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a" exitCode=0 Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.156783 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" event={"ID":"adf08273-4b03-4e6f-8e52-d968b8c98f99","Type":"ContainerDied","Data":"9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a"} Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.159964 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" event={"ID":"cee9b713-10b0-49a5-841d-fbb083faba9a","Type":"ContainerStarted","Data":"b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3"} Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.159997 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" event={"ID":"cee9b713-10b0-49a5-841d-fbb083faba9a","Type":"ContainerStarted","Data":"0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0"} Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.160008 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" event={"ID":"cee9b713-10b0-49a5-841d-fbb083faba9a","Type":"ContainerStarted","Data":"39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8"} Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.160017 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" event={"ID":"cee9b713-10b0-49a5-841d-fbb083faba9a","Type":"ContainerStarted","Data":"c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96"} Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.160025 4563 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" event={"ID":"cee9b713-10b0-49a5-841d-fbb083faba9a","Type":"ContainerStarted","Data":"3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf"} Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.160032 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" event={"ID":"cee9b713-10b0-49a5-841d-fbb083faba9a","Type":"ContainerStarted","Data":"69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d"} Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.171426 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:08Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.180444 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:08Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.188707 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:08Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.196802 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:08Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.211840 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:08Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.220328 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:
44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:08Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.228017 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:08Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.236965 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-11-24T09:04:08Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.237352 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.237383 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.237393 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.237426 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.237437 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:08Z","lastTransitionTime":"2025-11-24T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.244450 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:08Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.251327 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:08Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.259383 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:08Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.266154 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:08Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.274583 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 
UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:08Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.281610 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:08Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.339284 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.339305 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.339313 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.339325 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.339334 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:08Z","lastTransitionTime":"2025-11-24T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.441008 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.441039 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.441056 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.441069 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.441077 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:08Z","lastTransitionTime":"2025-11-24T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.543043 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.543070 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.543085 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.543097 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.543105 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:08Z","lastTransitionTime":"2025-11-24T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.644466 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.644493 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.644501 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.644512 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.644520 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:08Z","lastTransitionTime":"2025-11-24T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.748507 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.748546 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.748555 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.748577 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.748594 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:08Z","lastTransitionTime":"2025-11-24T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.850492 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.850525 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.850533 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.850546 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.850554 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:08Z","lastTransitionTime":"2025-11-24T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.952589 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.952651 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.952662 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.952674 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:08 crc kubenswrapper[4563]: I1124 09:04:08.952682 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:08Z","lastTransitionTime":"2025-11-24T09:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.054799 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.054833 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.054842 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.054853 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.054864 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:09Z","lastTransitionTime":"2025-11-24T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.156205 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.156374 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.156463 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.156536 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.156607 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:09Z","lastTransitionTime":"2025-11-24T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.163538 4563 generic.go:334] "Generic (PLEG): container finished" podID="adf08273-4b03-4e6f-8e52-d968b8c98f99" containerID="8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e" exitCode=0 Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.163579 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" event={"ID":"adf08273-4b03-4e6f-8e52-d968b8c98f99","Type":"ContainerDied","Data":"8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e"} Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.182490 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:09Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.191510 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:09Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.200359 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:09Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.210887 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:09Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 
09:04:09.219471 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:09Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.227235 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:09Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.235735 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:09Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.243509 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:09Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.254100 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 
UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:09Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.259883 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.259917 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.259928 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.259941 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.259950 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:09Z","lastTransitionTime":"2025-11-24T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.261775 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:09Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.275423 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:09Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.285452 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:09Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.294611 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:09Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.303746 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:09Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.362236 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.362262 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:09 crc 
kubenswrapper[4563]: I1124 09:04:09.362275 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.362287 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.362295 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:09Z","lastTransitionTime":"2025-11-24T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.464210 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.464239 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.464247 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.464259 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.464266 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:09Z","lastTransitionTime":"2025-11-24T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.566310 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.566344 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.566352 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.566363 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.566371 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:09Z","lastTransitionTime":"2025-11-24T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.668190 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.668224 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.668232 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.668244 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.668253 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:09Z","lastTransitionTime":"2025-11-24T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.769760 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.769792 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.769800 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.769812 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.769820 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:09Z","lastTransitionTime":"2025-11-24T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.871450 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.871488 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.871497 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.871509 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.871519 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:09Z","lastTransitionTime":"2025-11-24T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.885202 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-l4cg2"] Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.885481 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-l4cg2" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.886608 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.886609 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.887189 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.887775 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.896795 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:09Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.907454 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:09Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.919298 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:09Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.931941 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:09Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.937746 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffx6c\" (UniqueName: \"kubernetes.io/projected/8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa-kube-api-access-ffx6c\") pod \"node-ca-l4cg2\" (UID: \"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\") " pod="openshift-image-registry/node-ca-l4cg2" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.937784 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa-host\") pod \"node-ca-l4cg2\" (UID: \"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\") " pod="openshift-image-registry/node-ca-l4cg2" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.937819 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa-serviceca\") pod \"node-ca-l4cg2\" (UID: \"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\") " pod="openshift-image-registry/node-ca-l4cg2" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.938712 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:09Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.952435 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:09Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.961592 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:
44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:09Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.973131 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:09Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.974408 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.974497 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.974578 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.974668 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.974739 4563 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:09Z","lastTransitionTime":"2025-11-24T09:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.981930 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:09Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.989419 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:09Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:09 crc kubenswrapper[4563]: I1124 09:04:09.998136 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:09Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.006719 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:10Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.023358 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:10Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 
09:04:10.033079 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2775
3fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 
configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\
\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:10Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.038128 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa-host\") pod \"node-ca-l4cg2\" (UID: \"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\") " pod="openshift-image-registry/node-ca-l4cg2" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.038183 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa-serviceca\") pod \"node-ca-l4cg2\" (UID: \"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\") " pod="openshift-image-registry/node-ca-l4cg2" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.038202 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa-host\") pod \"node-ca-l4cg2\" (UID: \"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\") " pod="openshift-image-registry/node-ca-l4cg2" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.038310 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffx6c\" (UniqueName: \"kubernetes.io/projected/8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa-kube-api-access-ffx6c\") pod \"node-ca-l4cg2\" (UID: \"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\") " pod="openshift-image-registry/node-ca-l4cg2" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.039063 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa-serviceca\") pod \"node-ca-l4cg2\" (UID: \"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\") " pod="openshift-image-registry/node-ca-l4cg2" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.042440 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-24T09:04:10Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.053772 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.053783 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:10 crc kubenswrapper[4563]: E1124 09:04:10.053869 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.053923 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:10 crc kubenswrapper[4563]: E1124 09:04:10.053941 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:04:10 crc kubenswrapper[4563]: E1124 09:04:10.054031 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.054726 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffx6c\" (UniqueName: \"kubernetes.io/projected/8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa-kube-api-access-ffx6c\") pod \"node-ca-l4cg2\" (UID: \"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\") " pod="openshift-image-registry/node-ca-l4cg2" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.076909 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.076939 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.076950 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.076962 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.076984 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:10Z","lastTransitionTime":"2025-11-24T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.170270 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" event={"ID":"cee9b713-10b0-49a5-841d-fbb083faba9a","Type":"ContainerStarted","Data":"2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b"} Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.172515 4563 generic.go:334] "Generic (PLEG): container finished" podID="adf08273-4b03-4e6f-8e52-d968b8c98f99" containerID="d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a" exitCode=0 Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.172542 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" event={"ID":"adf08273-4b03-4e6f-8e52-d968b8c98f99","Type":"ContainerDied","Data":"d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a"} Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.178026 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.178057 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.178066 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.178077 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.178087 4563 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:10Z","lastTransitionTime":"2025-11-24T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.183742 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 
UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:10Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.191750 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:10Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.194181 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-l4cg2" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.205050 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:10Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:10 crc kubenswrapper[4563]: W1124 09:04:10.207924 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d6b29f6_3a4d_408e_b6fc_9f8ded8787aa.slice/crio-a132ebc6325abee93152054de03d8faa81b0080ee2401a548b23093ab203e312 WatchSource:0}: Error finding container a132ebc6325abee93152054de03d8faa81b0080ee2401a548b23093ab203e312: Status 404 returned error can't find the container with id a132ebc6325abee93152054de03d8faa81b0080ee2401a548b23093ab203e312 Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.212214 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:10Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.220678 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:10Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.229010 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:10Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.238143 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:10Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.253187 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:10Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.261848 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:
44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:10Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.270488 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:10Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.279911 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.279947 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.279956 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.279969 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.279978 4563 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:10Z","lastTransitionTime":"2025-11-24T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.280071 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:10Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.289114 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:10Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.299134 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:10Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.309264 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:10Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.317063 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:10Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.382436 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.382504 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.382517 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.382544 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.382558 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:10Z","lastTransitionTime":"2025-11-24T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.488375 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.488411 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.488420 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.488433 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.488443 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:10Z","lastTransitionTime":"2025-11-24T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.590385 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.590423 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.590432 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.590448 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.590458 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:10Z","lastTransitionTime":"2025-11-24T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.693005 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.693042 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.693051 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.693064 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.693072 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:10Z","lastTransitionTime":"2025-11-24T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.794714 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.794852 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.794917 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.794978 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.795036 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:10Z","lastTransitionTime":"2025-11-24T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.896553 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.896671 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.896749 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.896811 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.896867 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:10Z","lastTransitionTime":"2025-11-24T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.998372 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.998831 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.998898 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.998971 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:10 crc kubenswrapper[4563]: I1124 09:04:10.999027 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:10Z","lastTransitionTime":"2025-11-24T09:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.100607 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.100633 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.100661 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.100671 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.100680 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:11Z","lastTransitionTime":"2025-11-24T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.179128 4563 generic.go:334] "Generic (PLEG): container finished" podID="adf08273-4b03-4e6f-8e52-d968b8c98f99" containerID="bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89" exitCode=0 Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.179164 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" event={"ID":"adf08273-4b03-4e6f-8e52-d968b8c98f99","Type":"ContainerDied","Data":"bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89"} Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.180714 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-l4cg2" event={"ID":"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa","Type":"ContainerStarted","Data":"e9f8b7c51d44b5fb45c9f7142ebab598e5bbb2aa302fb70cdf67680e9f2d96ee"} Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.180738 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-l4cg2" event={"ID":"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa","Type":"ContainerStarted","Data":"a132ebc6325abee93152054de03d8faa81b0080ee2401a548b23093ab203e312"} Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.188647 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.202463 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.202493 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.202505 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.202519 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.202529 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:11Z","lastTransitionTime":"2025-11-24T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.205067 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.213287 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.225763 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.234291 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.247377 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.255990 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:
44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.264275 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.272369 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.279089 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.287019 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.294144 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.303237 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.305376 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.305397 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.305405 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.305419 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.305427 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:11Z","lastTransitionTime":"2025-11-24T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.313407 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 
UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.321340 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.329347 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.341078 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.348017 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f8b7c51d44b5fb45c9f7142ebab598e5bbb2aa302fb70cdf67680e9f2d96ee\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.355935 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.363574 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.375591 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.386304 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:
44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.394981 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.402180 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b02
3937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.408056 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.408082 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.408092 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:11 crc 
kubenswrapper[4563]: I1124 09:04:11.408106 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.408137 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:11Z","lastTransitionTime":"2025-11-24T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.411329 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd
7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
1-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.419434 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.425942 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.434014 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.442804 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 
UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.450081 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:11Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.509934 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.509972 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.509982 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.509996 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.510005 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:11Z","lastTransitionTime":"2025-11-24T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.611846 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.611873 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.611881 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.611891 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.611898 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:11Z","lastTransitionTime":"2025-11-24T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.713796 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.713827 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.713838 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.713851 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.713859 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:11Z","lastTransitionTime":"2025-11-24T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.815806 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.815846 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.815855 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.815871 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.815883 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:11Z","lastTransitionTime":"2025-11-24T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.917864 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.917900 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.917910 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.917924 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:11 crc kubenswrapper[4563]: I1124 09:04:11.917936 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:11Z","lastTransitionTime":"2025-11-24T09:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.019993 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.020022 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.020030 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.020042 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.020052 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:12Z","lastTransitionTime":"2025-11-24T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.053793 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.053812 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:12 crc kubenswrapper[4563]: E1124 09:04:12.053882 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.053895 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:12 crc kubenswrapper[4563]: E1124 09:04:12.053973 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:04:12 crc kubenswrapper[4563]: E1124 09:04:12.054027 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.121898 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.121928 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.121937 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.121949 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.121957 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:12Z","lastTransitionTime":"2025-11-24T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.185624 4563 generic.go:334] "Generic (PLEG): container finished" podID="adf08273-4b03-4e6f-8e52-d968b8c98f99" containerID="fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9" exitCode=0 Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.185661 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" event={"ID":"adf08273-4b03-4e6f-8e52-d968b8c98f99","Type":"ContainerDied","Data":"fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9"} Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.189740 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" event={"ID":"cee9b713-10b0-49a5-841d-fbb083faba9a","Type":"ContainerStarted","Data":"7d365d90654de27d78c2b1e068bd42a93b5962a36b32e0f4ca3edd26ef3e8035"} Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.190082 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.194423 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.204612 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.208950 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.213170 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb4
13bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.220682 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.224726 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.224751 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.224761 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.224773 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.224781 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:12Z","lastTransitionTime":"2025-11-24T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.230290 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.241743 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" 
certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 
09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedA
t\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.249800 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55
b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.256656 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f8b7c51d44b5fb45c9f7142ebab598e5bbb2aa302fb70cdf67680e9f2d96ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.268538 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.277336 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.286231 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.298575 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.314528 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.323913 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.326738 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.326772 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.326784 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.326800 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.326809 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:12Z","lastTransitionTime":"2025-11-24T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.333158 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.342436 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" 
certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 
09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedA
t\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.350084 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55
b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.359721 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.372400 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d365d90654de27d78c2b1e068bd42a93b5962a36b32e0f4ca3edd26ef3e8035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.379238 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f8b7c51d44b5fb45c9f7142ebab598e5bbb2aa302fb70cdf67680e9f2d96ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.387165 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.395201 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.408799 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.416905 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:
44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.424696 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.428603 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.428632 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.428666 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.428683 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.428693 4563 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:12Z","lastTransitionTime":"2025-11-24T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.431941 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.441069 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.449081 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.456162 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.464472 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.530526 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.530559 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.530581 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.530597 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.530608 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:12Z","lastTransitionTime":"2025-11-24T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.632786 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.632832 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.632841 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.632858 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.632872 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:12Z","lastTransitionTime":"2025-11-24T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.735198 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.735246 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.735258 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.735277 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.735290 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:12Z","lastTransitionTime":"2025-11-24T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.837364 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.837409 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.837419 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.837436 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.837449 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:12Z","lastTransitionTime":"2025-11-24T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.940005 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.940053 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.940063 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.940086 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.940098 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:12Z","lastTransitionTime":"2025-11-24T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.956870 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.956921 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.956933 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.956949 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.956960 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:12Z","lastTransitionTime":"2025-11-24T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:12 crc kubenswrapper[4563]: E1124 09:04:12.967076 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.970044 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.970079 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.970091 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.970105 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.970116 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:12Z","lastTransitionTime":"2025-11-24T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:12 crc kubenswrapper[4563]: E1124 09:04:12.978123 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.980794 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.980827 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.980836 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.980849 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.980859 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:12Z","lastTransitionTime":"2025-11-24T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:12 crc kubenswrapper[4563]: E1124 09:04:12.989678 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [...node-status payload identical to the preceding entry, elided...] for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.992163 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.992184 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.992193 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.992202 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:12 crc kubenswrapper[4563]: I1124 09:04:12.992212 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:12Z","lastTransitionTime":"2025-11-24T09:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:13 crc kubenswrapper[4563]: E1124 09:04:12.999932 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [...node-status payload identical to the preceding entry, elided...] for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:12Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.006612 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.006661 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.006671 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.006682 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.006692 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:13Z","lastTransitionTime":"2025-11-24T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:13 crc kubenswrapper[4563]: E1124 09:04:13.014968 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: E1124 09:04:13.015076 4563 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.041601 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.041631 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.041661 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.041674 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.041681 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:13Z","lastTransitionTime":"2025-11-24T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.064428 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 
UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.073382 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.083375 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.093139 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.101698 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.122521 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\"
:{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174
f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d365d90654de27d78c2b1e068bd42a93b5962a36b32e0f4ca3edd26ef3e8035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\
\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.132234 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f8b7c51d44
b5fb45c9f7142ebab598e5bbb2aa302fb70cdf67680e9f2d96ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.142776 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.142815 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.142825 4563 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.142839 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.142851 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:13Z","lastTransitionTime":"2025-11-24T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.147453 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4
9117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\
\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b6
52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.157388 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.166818 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.175502 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.182342 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.192250 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.196012 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" event={"ID":"adf08273-4b03-4e6f-8e52-d968b8c98f99","Type":"ContainerStarted","Data":"711f1b2e1de3f2bb8bdc6bf1268e8ef7833d05f31b700a7ed752c73b450de3b8"} Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.196133 4563 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.196448 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.200765 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b02
3937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.211722 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.216095 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.230009 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.241058 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:
44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.245211 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.245261 
4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.245274 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.245291 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.245304 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:13Z","lastTransitionTime":"2025-11-24T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.251062 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.263845 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f1b2e1de3f2bb8bdc6bf1268e8ef7833d05f31b700a7ed752c73b450de3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c
958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:10Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.274787 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.283127 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.308293 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.346623 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.347500 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.347547 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.347571 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.347592 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.347606 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:13Z","lastTransitionTime":"2025-11-24T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.389417 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 
UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.426517 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.449242 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.449279 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.449292 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.449309 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.449320 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:13Z","lastTransitionTime":"2025-11-24T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.471062 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d365d90654de27d78c2b1e068bd42a93b5962a36b32e0f4ca3edd26ef3e8035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.505834 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f8b7c51d44b5fb45c9f7142ebab598e5bbb2aa302fb70cdf67680e9f2d96ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.548307 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.551816 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.551848 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.551870 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.551886 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.551895 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:13Z","lastTransitionTime":"2025-11-24T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.595619 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.640817 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.654918 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.654946 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.654953 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.654966 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.654975 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:13Z","lastTransitionTime":"2025-11-24T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.672426 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d365d90654de27d78c2b1e068bd42a93b5962a36b32e0f4ca3edd26ef3e8035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.706770 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f8b7c51d44b5fb45c9f7142ebab598e5bbb2aa302fb70cdf67680e9f2d96ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.748955 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.757959 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.758017 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.758029 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.758052 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.758065 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:13Z","lastTransitionTime":"2025-11-24T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.788626 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.826695 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.861398 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.861439 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.861449 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.861469 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.861481 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:13Z","lastTransitionTime":"2025-11-24T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.875978 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.908528 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.948112 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.963907 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.963953 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.963963 4563 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.963980 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.963993 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:13Z","lastTransitionTime":"2025-11-24T09:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:13 crc kubenswrapper[4563]: I1124 09:04:13.991276 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f1b2e1de3f2bb8bdc
6bf1268e8ef7833d05f31b700a7ed752c73b450de3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.029092 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:14Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.030148 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.054373 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.054394 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.054383 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:14 crc kubenswrapper[4563]: E1124 09:04:14.054502 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:04:14 crc kubenswrapper[4563]: E1124 09:04:14.054611 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:04:14 crc kubenswrapper[4563]: E1124 09:04:14.054735 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.067229 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.067265 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.067278 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.067292 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.067308 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:14Z","lastTransitionTime":"2025-11-24T09:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.073055 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:14Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.109831 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:14Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.147696 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:14Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.169404 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.169449 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.169462 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.169480 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.169490 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:14Z","lastTransitionTime":"2025-11-24T09:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.187926 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 
UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:14Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.201179 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vgbgr_cee9b713-10b0-49a5-841d-fbb083faba9a/ovnkube-controller/0.log" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.203973 4563 generic.go:334] "Generic (PLEG): container finished" podID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerID="7d365d90654de27d78c2b1e068bd42a93b5962a36b32e0f4ca3edd26ef3e8035" exitCode=1 Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.204025 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" event={"ID":"cee9b713-10b0-49a5-841d-fbb083faba9a","Type":"ContainerDied","Data":"7d365d90654de27d78c2b1e068bd42a93b5962a36b32e0f4ca3edd26ef3e8035"} Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.204667 4563 scope.go:117] "RemoveContainer" containerID="7d365d90654de27d78c2b1e068bd42a93b5962a36b32e0f4ca3edd26ef3e8035" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.227571 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-24T09:04:14Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.271520 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.271548 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.271574 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.271592 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.271611 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:14Z","lastTransitionTime":"2025-11-24T09:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.274622 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:14Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.311029 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:14Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.347823 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:14Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.375001 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.375068 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.375085 4563 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.375115 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.375137 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:14Z","lastTransitionTime":"2025-11-24T09:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.391120 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f1b2e1de3f2bb8bdc
6bf1268e8ef7833d05f31b700a7ed752c73b450de3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:14Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.427351 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:14Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.466739 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:14Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.477414 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.477446 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.477457 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.477477 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.477489 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:14Z","lastTransitionTime":"2025-11-24T09:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.515747 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:14Z 
is after 2025-08-24T17:21:41Z" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.547451 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:14Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.580366 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.580410 4563 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.580421 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.580439 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.580452 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:14Z","lastTransitionTime":"2025-11-24T09:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.599121 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 
UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:14Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.634420 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:14Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.672679 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d365d90654de27d78c2b1e068bd42a93b5962a36b32e0f4ca3edd26ef3e8035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d365d90654de27d78c2b1e068bd42a93b5962a36b32e0f4ca3edd26ef3e8035\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09
:04:14Z\\\",\\\"message\\\":\\\"ute/v1/apis/informers/externalversions/factory.go:140\\\\nI1124 09:04:14.126465 5887 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1124 09:04:14.126528 5887 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1124 09:04:14.126550 5887 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1124 09:04:14.126557 5887 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1124 09:04:14.126581 5887 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1124 09:04:14.126585 5887 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1124 09:04:14.126596 5887 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1124 09:04:14.126606 5887 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1124 09:04:14.126653 5887 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1124 09:04:14.126620 5887 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1124 09:04:14.126630 5887 factory.go:656] Stopping watch factory\\\\nI1124 09:04:14.126678 5887 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1124 09:04:14.126681 5887 ovnkube.go:599] Stopped ovnkube\\\\nI1124 09:04:14.126651 5887 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1124 09:04:14.126718 5887 metrics.go:553] Stopping metrics server at address 
\\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eace
e46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:14Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.682202 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.682238 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.682248 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.682263 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.682274 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:14Z","lastTransitionTime":"2025-11-24T09:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.707260 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f8b7c51d44b5fb45c9f7142ebab598e5bbb2aa302fb70cdf67680e9f2d96ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":
\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:14Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.749265 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:14Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.784232 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.784275 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.784286 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.784303 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.784316 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:14Z","lastTransitionTime":"2025-11-24T09:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.793455 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:14Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.828408 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:14Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.886093 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.886143 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.886152 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.886166 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.886177 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:14Z","lastTransitionTime":"2025-11-24T09:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.989259 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.989308 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.989320 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.989341 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:14 crc kubenswrapper[4563]: I1124 09:04:14.989354 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:14Z","lastTransitionTime":"2025-11-24T09:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.091906 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.091950 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.091962 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.091977 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.091988 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:15Z","lastTransitionTime":"2025-11-24T09:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.194293 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.194340 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.194354 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.194371 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.194384 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:15Z","lastTransitionTime":"2025-11-24T09:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.208077 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vgbgr_cee9b713-10b0-49a5-841d-fbb083faba9a/ovnkube-controller/1.log" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.208529 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vgbgr_cee9b713-10b0-49a5-841d-fbb083faba9a/ovnkube-controller/0.log" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.210192 4563 generic.go:334] "Generic (PLEG): container finished" podID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerID="781774c6161f09cef82afcca790bdec5fe0b402f1522104c3e628ddd61510db8" exitCode=1 Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.210224 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" event={"ID":"cee9b713-10b0-49a5-841d-fbb083faba9a","Type":"ContainerDied","Data":"781774c6161f09cef82afcca790bdec5fe0b402f1522104c3e628ddd61510db8"} Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.210270 4563 scope.go:117] "RemoveContainer" containerID="7d365d90654de27d78c2b1e068bd42a93b5962a36b32e0f4ca3edd26ef3e8035" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.210860 4563 scope.go:117] "RemoveContainer" containerID="781774c6161f09cef82afcca790bdec5fe0b402f1522104c3e628ddd61510db8" Nov 24 09:04:15 crc kubenswrapper[4563]: E1124 09:04:15.211056 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-vgbgr_openshift-ovn-kubernetes(cee9b713-10b0-49a5-841d-fbb083faba9a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.226951 4563 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc5
1579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d0
9fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:15Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.236778 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c
9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tr
ue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:15Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.245709 4563 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:15Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.254671 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:15Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.261525 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:15Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.270795 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:15Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.279892 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:15Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.292849 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f1b2e1de3f2bb8bdc6bf1268e8ef7833d05f31b700a7ed752c73b450de3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:15Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.297219 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.297248 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.297267 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.297285 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.297299 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:15Z","lastTransitionTime":"2025-11-24T09:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.306333 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 
UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:15Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.315461 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:15Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.326444 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:15Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.337830 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:15Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.346867 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:15Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.392940 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://781774c6161f09cef82afcca790bdec5fe0b402f1522104c3e628ddd61510db8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d365d90654de27d78c2b1e068bd42a93b5962a36b32e0f4ca3edd26ef3e8035\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:14Z\\\",\\\"message\\\":\\\"ute/v1/apis/informers/externalversions/factory.go:140\\\\nI1124 09:04:14.126465 5887 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1124 09:04:14.126528 5887 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1124 09:04:14.126550 5887 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1124 09:04:14.126557 5887 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1124 09:04:14.126581 5887 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1124 09:04:14.126585 5887 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1124 09:04:14.126596 5887 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1124 09:04:14.126606 5887 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1124 09:04:14.126653 5887 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1124 09:04:14.126620 5887 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1124 09:04:14.126630 5887 factory.go:656] Stopping watch factory\\\\nI1124 09:04:14.126678 5887 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1124 09:04:14.126681 5887 ovnkube.go:599] Stopped ovnkube\\\\nI1124 09:04:14.126651 5887 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1124 09:04:14.126718 5887 metrics.go:553] Stopping metrics server at address \\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781774c6161f09cef82afcca790bdec5fe0b402f1522104c3e628ddd61510db8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:14Z\\\",\\\"message\\\":\\\"g reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1124 09:04:14.876314 6023 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 
09:04:14.876546 6023 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1124 09:04:14.876655 6023 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876700 6023 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876804 6023 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.877255 6023 factory.go:656] Stopping watch factory\\\\nI1124 09:04:14.930167 6023 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1124 09:04:14.930193 6023 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1124 09:04:14.930244 6023 ovnkube.go:599] Stopped ovnkube\\\\nI1124 09:04:14.930277 6023 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1124 09:04:14.930370 6023 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:15Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.399431 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.399471 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.399480 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.399497 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.399508 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:15Z","lastTransitionTime":"2025-11-24T09:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.426019 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f8b7c51d44b5fb45c9f7142ebab598e5bbb2aa302fb70cdf67680e9f2d96ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":
\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:15Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.501116 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.501167 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.501180 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.501203 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.501408 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:15Z","lastTransitionTime":"2025-11-24T09:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.603751 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.603812 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.603826 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.603852 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.603869 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:15Z","lastTransitionTime":"2025-11-24T09:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.687895 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:04:15 crc kubenswrapper[4563]: E1124 09:04:15.688079 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-24 09:04:31.688060738 +0000 UTC m=+48.947038184 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.706231 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.706268 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.706277 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.706290 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.706299 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:15Z","lastTransitionTime":"2025-11-24T09:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.788932 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.788999 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.789027 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.789054 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:15 crc kubenswrapper[4563]: E1124 09:04:15.789152 4563 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 09:04:15 crc kubenswrapper[4563]: E1124 09:04:15.789188 4563 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 24 09:04:15 crc kubenswrapper[4563]: E1124 09:04:15.789153 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 09:04:15 crc kubenswrapper[4563]: E1124 09:04:15.789258 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 09:04:15 crc kubenswrapper[4563]: E1124 09:04:15.789271 4563 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:04:15 crc kubenswrapper[4563]: E1124 09:04:15.789215 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 09:04:31.789200067 +0000 UTC m=+49.048177515 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 09:04:15 crc kubenswrapper[4563]: E1124 09:04:15.789324 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 09:04:31.789292672 +0000 UTC m=+49.048270120 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 24 09:04:15 crc kubenswrapper[4563]: E1124 09:04:15.789351 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-24 09:04:31.789338598 +0000 UTC m=+49.048316046 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:04:15 crc kubenswrapper[4563]: E1124 09:04:15.789421 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 09:04:15 crc kubenswrapper[4563]: E1124 09:04:15.789478 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 09:04:15 crc kubenswrapper[4563]: E1124 09:04:15.789498 4563 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:04:15 crc kubenswrapper[4563]: E1124 09:04:15.789609 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-24 09:04:31.789578871 +0000 UTC m=+49.048556318 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.809856 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.809914 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.809931 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.809956 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.809971 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:15Z","lastTransitionTime":"2025-11-24T09:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.912484 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.912546 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.912572 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.912596 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:15 crc kubenswrapper[4563]: I1124 09:04:15.912610 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:15Z","lastTransitionTime":"2025-11-24T09:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.015315 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.015482 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.015569 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.015664 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.015732 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:16Z","lastTransitionTime":"2025-11-24T09:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.053686 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.053784 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.053758 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:16 crc kubenswrapper[4563]: E1124 09:04:16.054001 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:04:16 crc kubenswrapper[4563]: E1124 09:04:16.054096 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:04:16 crc kubenswrapper[4563]: E1124 09:04:16.054207 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.118240 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.118260 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.118268 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.118281 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.118291 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:16Z","lastTransitionTime":"2025-11-24T09:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.216180 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vgbgr_cee9b713-10b0-49a5-841d-fbb083faba9a/ovnkube-controller/1.log" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.219697 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.219732 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.219741 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.219756 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.219765 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:16Z","lastTransitionTime":"2025-11-24T09:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.220756 4563 scope.go:117] "RemoveContainer" containerID="781774c6161f09cef82afcca790bdec5fe0b402f1522104c3e628ddd61510db8" Nov 24 09:04:16 crc kubenswrapper[4563]: E1124 09:04:16.220886 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-vgbgr_openshift-ovn-kubernetes(cee9b713-10b0-49a5-841d-fbb083faba9a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.231507 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 
UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:16Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.240127 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:16Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.252964 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://781774c6161f09cef82afcca790bdec5fe0b402f1522104c3e628ddd61510db8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781774c6161f09cef82afcca790bdec5fe0b402f1522104c3e628ddd61510db8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:14Z\\\",\\\"message\\\":\\\"g reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1124 09:04:14.876314 6023 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876546 6023 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1124 09:04:14.876655 6023 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876700 6023 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876804 6023 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.877255 6023 factory.go:656] Stopping watch factory\\\\nI1124 09:04:14.930167 6023 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1124 09:04:14.930193 6023 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1124 09:04:14.930244 6023 ovnkube.go:599] Stopped ovnkube\\\\nI1124 09:04:14.930277 6023 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1124 09:04:14.930370 6023 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vgbgr_openshift-ovn-kubernetes(cee9b713-10b0-49a5-841d-fbb083faba9a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf78966056
4d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:16Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.259908 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f8b7c51d44b5fb45c9f7142ebab598e5bbb2aa302fb70cdf67680e9f2d96ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:16Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.268313 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:16Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.276392 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:16Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.284100 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:16Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.297415 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:16Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.307091 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:
44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:16Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.315182 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:16Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.322268 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.322478 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.322544 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.322652 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.322713 4563 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:16Z","lastTransitionTime":"2025-11-24T09:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.326825 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f1b2e1de3f2bb8bdc6bf1268e8ef7833d05f31b700a7ed752c73b450de3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:16Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.335865 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:16Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.343377 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:16Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.353287 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:16Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.361838 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:16Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.424764 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.424811 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.424822 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.424844 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.424858 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:16Z","lastTransitionTime":"2025-11-24T09:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.527214 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.527374 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.527442 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.527515 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.527583 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:16Z","lastTransitionTime":"2025-11-24T09:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.629593 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.629631 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.629662 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.629676 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.629687 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:16Z","lastTransitionTime":"2025-11-24T09:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.732055 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.732114 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.732126 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.732141 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.732150 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:16Z","lastTransitionTime":"2025-11-24T09:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.834711 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.834760 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.834771 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.834791 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.834803 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:16Z","lastTransitionTime":"2025-11-24T09:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.936946 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.937051 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.937127 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.937213 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:16 crc kubenswrapper[4563]: I1124 09:04:16.937272 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:16Z","lastTransitionTime":"2025-11-24T09:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.040057 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.040110 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.040121 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.040140 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.040154 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:17Z","lastTransitionTime":"2025-11-24T09:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.142709 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.142747 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.142761 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.142778 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.142789 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:17Z","lastTransitionTime":"2025-11-24T09:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.245071 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.245120 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.245132 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.245155 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.245169 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:17Z","lastTransitionTime":"2025-11-24T09:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.347721 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.347762 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.347772 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.347787 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.347799 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:17Z","lastTransitionTime":"2025-11-24T09:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.450269 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.450311 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.450320 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.450336 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.450346 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:17Z","lastTransitionTime":"2025-11-24T09:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.553121 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.553159 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.553171 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.553188 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.553200 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:17Z","lastTransitionTime":"2025-11-24T09:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.655860 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.655893 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.655903 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.655916 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.655925 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:17Z","lastTransitionTime":"2025-11-24T09:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.720727 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn"] Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.721386 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.724000 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.724196 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.738151 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://781774c6161f09cef82afcca790bdec5fe0b402f1522104c3e628ddd61510db8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781774c6161f09cef82afcca790bdec5fe0b402f1522104c3e628ddd61510db8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:14Z\\\",\\\"message\\\":\\\"g reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1124 09:04:14.876314 6023 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876546 6023 reflector.go:311] Stopping reflector 
*v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1124 09:04:14.876655 6023 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876700 6023 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876804 6023 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.877255 6023 factory.go:656] Stopping watch factory\\\\nI1124 09:04:14.930167 6023 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1124 09:04:14.930193 6023 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1124 09:04:14.930244 6023 ovnkube.go:599] Stopped ovnkube\\\\nI1124 09:04:14.930277 6023 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1124 09:04:14.930370 6023 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vgbgr_openshift-ovn-kubernetes(cee9b713-10b0-49a5-841d-fbb083faba9a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf78966056
4d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:17Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.745437 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f8b7c51d44b5fb45c9f7142ebab598e5bbb2aa302fb70cdf67680e9f2d96ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:17Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.756463 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:17Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.757793 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.757839 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.757850 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.757863 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.757872 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:17Z","lastTransitionTime":"2025-11-24T09:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.766722 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f900
5251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:17Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.775789 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:17Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.788973 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:17Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.798163 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:
44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:17Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.806482 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:17Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.807848 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b9c54737-f104-46e9-86b9-0e9ce7915e12-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-5ssmn\" (UID: \"b9c54737-f104-46e9-86b9-0e9ce7915e12\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.807919 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b9c54737-f104-46e9-86b9-0e9ce7915e12-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-5ssmn\" (UID: \"b9c54737-f104-46e9-86b9-0e9ce7915e12\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" Nov 24 09:04:17 
crc kubenswrapper[4563]: I1124 09:04:17.807945 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbzzq\" (UniqueName: \"kubernetes.io/projected/b9c54737-f104-46e9-86b9-0e9ce7915e12-kube-api-access-zbzzq\") pod \"ovnkube-control-plane-749d76644c-5ssmn\" (UID: \"b9c54737-f104-46e9-86b9-0e9ce7915e12\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.807968 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b9c54737-f104-46e9-86b9-0e9ce7915e12-env-overrides\") pod \"ovnkube-control-plane-749d76644c-5ssmn\" (UID: \"b9c54737-f104-46e9-86b9-0e9ce7915e12\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.816256 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f1b2e1de3f2bb8bdc6bf1268e8ef7833d05f31b700a7ed752c73b450de3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c
958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:10Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:17Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.823635 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9c54737-f104-46e9-86b9-0e9ce7915e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5ssmn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:17Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.833121 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:17Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.840600 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:17Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.849984 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:17Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.857864 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:17Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.861349 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.861450 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.861473 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.861503 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.861535 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:17Z","lastTransitionTime":"2025-11-24T09:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.871477 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 
UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:17Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.879450 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:17Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.909068 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b9c54737-f104-46e9-86b9-0e9ce7915e12-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-5ssmn\" (UID: \"b9c54737-f104-46e9-86b9-0e9ce7915e12\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.909148 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b9c54737-f104-46e9-86b9-0e9ce7915e12-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-5ssmn\" (UID: \"b9c54737-f104-46e9-86b9-0e9ce7915e12\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.909177 4563 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-zbzzq\" (UniqueName: \"kubernetes.io/projected/b9c54737-f104-46e9-86b9-0e9ce7915e12-kube-api-access-zbzzq\") pod \"ovnkube-control-plane-749d76644c-5ssmn\" (UID: \"b9c54737-f104-46e9-86b9-0e9ce7915e12\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.909201 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b9c54737-f104-46e9-86b9-0e9ce7915e12-env-overrides\") pod \"ovnkube-control-plane-749d76644c-5ssmn\" (UID: \"b9c54737-f104-46e9-86b9-0e9ce7915e12\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.909931 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b9c54737-f104-46e9-86b9-0e9ce7915e12-env-overrides\") pod \"ovnkube-control-plane-749d76644c-5ssmn\" (UID: \"b9c54737-f104-46e9-86b9-0e9ce7915e12\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.910035 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b9c54737-f104-46e9-86b9-0e9ce7915e12-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-5ssmn\" (UID: \"b9c54737-f104-46e9-86b9-0e9ce7915e12\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.913591 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b9c54737-f104-46e9-86b9-0e9ce7915e12-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-5ssmn\" (UID: \"b9c54737-f104-46e9-86b9-0e9ce7915e12\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.921307 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbzzq\" (UniqueName: \"kubernetes.io/projected/b9c54737-f104-46e9-86b9-0e9ce7915e12-kube-api-access-zbzzq\") pod \"ovnkube-control-plane-749d76644c-5ssmn\" (UID: \"b9c54737-f104-46e9-86b9-0e9ce7915e12\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.964298 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.964362 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.964374 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.964396 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:17 crc kubenswrapper[4563]: I1124 09:04:17.964408 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:17Z","lastTransitionTime":"2025-11-24T09:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.032200 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" Nov 24 09:04:18 crc kubenswrapper[4563]: W1124 09:04:18.045522 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9c54737_f104_46e9_86b9_0e9ce7915e12.slice/crio-8b846eeae24a1de7f0a89b5f4d6c9962cc1e19eb53b6282fde5fdfb234be6c0f WatchSource:0}: Error finding container 8b846eeae24a1de7f0a89b5f4d6c9962cc1e19eb53b6282fde5fdfb234be6c0f: Status 404 returned error can't find the container with id 8b846eeae24a1de7f0a89b5f4d6c9962cc1e19eb53b6282fde5fdfb234be6c0f Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.054655 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:18 crc kubenswrapper[4563]: E1124 09:04:18.054905 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.055022 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:18 crc kubenswrapper[4563]: E1124 09:04:18.055105 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.055242 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:18 crc kubenswrapper[4563]: E1124 09:04:18.055393 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.066433 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.066458 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.066476 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.066491 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.066501 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:18Z","lastTransitionTime":"2025-11-24T09:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.169665 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.169715 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.169726 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.169740 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.169749 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:18Z","lastTransitionTime":"2025-11-24T09:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.227491 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" event={"ID":"b9c54737-f104-46e9-86b9-0e9ce7915e12","Type":"ContainerStarted","Data":"3f1234c57f8afc7f471626ad72c5c5c9bba242e98b468a7554585ac1e0c1aa7a"} Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.227670 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" event={"ID":"b9c54737-f104-46e9-86b9-0e9ce7915e12","Type":"ContainerStarted","Data":"8b846eeae24a1de7f0a89b5f4d6c9962cc1e19eb53b6282fde5fdfb234be6c0f"} Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.272607 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.272681 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.272693 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.272711 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.272722 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:18Z","lastTransitionTime":"2025-11-24T09:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.374778 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.374818 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.374828 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.374842 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.374850 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:18Z","lastTransitionTime":"2025-11-24T09:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.476392 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.476576 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.476587 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.476601 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.476609 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:18Z","lastTransitionTime":"2025-11-24T09:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.579595 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.579678 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.579694 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.579722 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.579740 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:18Z","lastTransitionTime":"2025-11-24T09:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.682173 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.682211 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.682220 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.682247 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.682257 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:18Z","lastTransitionTime":"2025-11-24T09:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.784540 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.784583 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.784592 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.784603 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.784616 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:18Z","lastTransitionTime":"2025-11-24T09:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.886485 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.886532 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.886540 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.886561 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.886571 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:18Z","lastTransitionTime":"2025-11-24T09:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.989275 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.989302 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.989311 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.989322 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:18 crc kubenswrapper[4563]: I1124 09:04:18.989335 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:18Z","lastTransitionTime":"2025-11-24T09:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.091436 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.091481 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.091491 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.091502 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.091512 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:19Z","lastTransitionTime":"2025-11-24T09:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.138759 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-bsfsd"] Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.139308 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:19 crc kubenswrapper[4563]: E1124 09:04:19.139383 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.151475 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 
UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.161517 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.171063 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.185726 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://781774c6161f09cef82afcca790bdec5fe0b402f1522104c3e628ddd61510db8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781774c6161f09cef82afcca790bdec5fe0b402f1522104c3e628ddd61510db8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:14Z\\\",\\\"message\\\":\\\"g reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1124 09:04:14.876314 6023 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876546 6023 reflector.go:311] Stopping reflector 
*v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1124 09:04:14.876655 6023 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876700 6023 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876804 6023 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.877255 6023 factory.go:656] Stopping watch factory\\\\nI1124 09:04:14.930167 6023 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1124 09:04:14.930193 6023 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1124 09:04:14.930244 6023 ovnkube.go:599] Stopped ovnkube\\\\nI1124 09:04:14.930277 6023 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1124 09:04:14.930370 6023 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vgbgr_openshift-ovn-kubernetes(cee9b713-10b0-49a5-841d-fbb083faba9a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf78966056
4d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.193200 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.193229 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.193238 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.193251 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.193261 4563 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:19Z","lastTransitionTime":"2025-11-24T09:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.194216 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f8b7c51d44b5fb45c9f7142ebab598e5bbb2aa302fb70cdf67680e9f2d96ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.203889 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.213182 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.221772 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4dmd\" (UniqueName: \"kubernetes.io/projected/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-kube-api-access-l4dmd\") pod \"network-metrics-daemon-bsfsd\" (UID: \"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\") " pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.221961 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-metrics-certs\") pod \"network-metrics-daemon-bsfsd\" (UID: \"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\") " pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.229780 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-p
od-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb7938771185
8ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a673
14731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"con
tainerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.232489 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" event={"ID":"b9c54737-f104-46e9-86b9-0e9ce7915e12","Type":"ContainerStarted","Data":"f0090fc7a8b837592f800c6aadc864f6fd9aa175561da4bb7cdf19c8e174884d"} Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.239879 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.248342 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.257355 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b02
3937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.268915 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f1b2e1de3f2bb8bdc6bf1268e8ef7833d05f31b700a7ed752c73b450de3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c
958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:10Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.277210 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9c54737-f104-46e9-86b9-0e9ce7915e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5ssmn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.285122 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bsfsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bsfsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc 
kubenswrapper[4563]: I1124 09:04:19.293851 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.295436 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.295460 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.295472 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.295493 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.295510 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:19Z","lastTransitionTime":"2025-11-24T09:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.301702 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.310789 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68da
e6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.323098 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-metrics-certs\") pod \"network-metrics-daemon-bsfsd\" (UID: \"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\") " pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.323173 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4dmd\" (UniqueName: \"kubernetes.io/projected/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-kube-api-access-l4dmd\") pod \"network-metrics-daemon-bsfsd\" (UID: \"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\") " pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:19 crc kubenswrapper[4563]: E1124 09:04:19.323255 4563 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 24 09:04:19 crc kubenswrapper[4563]: E1124 09:04:19.323308 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-metrics-certs 
podName:4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0 nodeName:}" failed. No retries permitted until 2025-11-24 09:04:19.82329496 +0000 UTC m=+37.082272406 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-metrics-certs") pod "network-metrics-daemon-bsfsd" (UID: "4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.332874 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"rest
artCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/
log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"container
ID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a564
6fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.342589 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4dmd\" (UniqueName: \"kubernetes.io/projected/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-kube-api-access-l4dmd\") pod \"network-metrics-daemon-bsfsd\" (UID: \"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\") " pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.343359 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.352049 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.361737 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f1b2e1de3f2bb8bdc6bf1268e8ef7833d05f31b700a7ed752c73b450de3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c
958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:10Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.369587 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9c54737-f104-46e9-86b9-0e9ce7915e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f1234c57f8afc7f471626ad72c5c5c9bba242e98b468a7554585ac1e0c1aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0090fc7a8b837592f800c6aadc864f6fd9aa175561da4bb7cdf19c8e174884d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5ssmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.376662 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bsfsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bsfsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc 
kubenswrapper[4563]: I1124 09:04:19.384574 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.391084 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.398270 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.398303 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.398313 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.398335 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.398346 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:19Z","lastTransitionTime":"2025-11-24T09:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.399341 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z 
is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.406961 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.415844 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41
dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.424278 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.440057 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://781774c6161f09cef82afcca790bdec5fe0b402f1522104c3e628ddd61510db8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781774c6161f09cef82afcca790bdec5fe0b402f1522104c3e628ddd61510db8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:14Z\\\",\\\"message\\\":\\\"g reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1124 09:04:14.876314 6023 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876546 6023 reflector.go:311] Stopping reflector 
*v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1124 09:04:14.876655 6023 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876700 6023 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876804 6023 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.877255 6023 factory.go:656] Stopping watch factory\\\\nI1124 09:04:14.930167 6023 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1124 09:04:14.930193 6023 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1124 09:04:14.930244 6023 ovnkube.go:599] Stopped ovnkube\\\\nI1124 09:04:14.930277 6023 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1124 09:04:14.930370 6023 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vgbgr_openshift-ovn-kubernetes(cee9b713-10b0-49a5-841d-fbb083faba9a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf78966056
4d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.447434 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f8b7c51d44b5fb45c9f7142ebab598e5bbb2aa302fb70cdf67680e9f2d96ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.457807 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.468923 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.478826 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.501348 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.501395 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:19 crc 
kubenswrapper[4563]: I1124 09:04:19.501406 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.501431 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.501451 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:19Z","lastTransitionTime":"2025-11-24T09:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.542999 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.555909 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.567080 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.576103 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.588207 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f1b2e1de3f2bb8bdc6bf1268e8ef7833d05f31b700a7ed752c73b450de3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.597147 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9c54737-f104-46e9-86b9-0e9ce7915e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f1234c57f8afc7f471626ad72c5c5c9bba242e98b468a7554585ac1e0c1aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f
4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0090fc7a8b837592f800c6aadc864f6fd9aa175561da4bb7cdf19c8e174884d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:17Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5ssmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.604067 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.604103 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.604114 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.604132 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.604142 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:19Z","lastTransitionTime":"2025-11-24T09:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.605138 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bsfsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bsfsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc 
kubenswrapper[4563]: I1124 09:04:19.614409 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.622821 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.632255 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41d
d18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.646784 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.656274 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.665002 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.678113 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://781774c6161f09cef82afcca790bdec5fe0b402f1522104c3e628ddd61510db8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781774c6161f09cef82afcca790bdec5fe0b402f1522104c3e628ddd61510db8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:14Z\\\",\\\"message\\\":\\\"g reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1124 09:04:14.876314 6023 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876546 6023 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1124 09:04:14.876655 6023 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876700 6023 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876804 6023 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.877255 6023 factory.go:656] Stopping watch factory\\\\nI1124 09:04:14.930167 6023 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1124 09:04:14.930193 6023 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1124 09:04:14.930244 6023 ovnkube.go:599] Stopped ovnkube\\\\nI1124 09:04:14.930277 6023 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1124 09:04:14.930370 6023 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vgbgr_openshift-ovn-kubernetes(cee9b713-10b0-49a5-841d-fbb083faba9a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf78966056
4d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.690218 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f8b7c51d44b5fb45c9f7142ebab598e5bbb2aa302fb70cdf67680e9f2d96ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.700193 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.706580 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.706622 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.706653 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.706672 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.706681 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:19Z","lastTransitionTime":"2025-11-24T09:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.711610 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.727301 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:19Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.809012 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.809038 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.809047 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.809061 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.809072 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:19Z","lastTransitionTime":"2025-11-24T09:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.827787 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-metrics-certs\") pod \"network-metrics-daemon-bsfsd\" (UID: \"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\") " pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:19 crc kubenswrapper[4563]: E1124 09:04:19.827915 4563 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 24 09:04:19 crc kubenswrapper[4563]: E1124 09:04:19.827962 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-metrics-certs podName:4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0 nodeName:}" failed. No retries permitted until 2025-11-24 09:04:20.827949466 +0000 UTC m=+38.086926913 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-metrics-certs") pod "network-metrics-daemon-bsfsd" (UID: "4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.911310 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.911349 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.911362 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.911377 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:19 crc kubenswrapper[4563]: I1124 09:04:19.911388 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:19Z","lastTransitionTime":"2025-11-24T09:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.013912 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.013966 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.013981 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.014002 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.014014 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:20Z","lastTransitionTime":"2025-11-24T09:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.054449 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.054584 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.054483 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:20 crc kubenswrapper[4563]: E1124 09:04:20.054864 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:04:20 crc kubenswrapper[4563]: E1124 09:04:20.054972 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:04:20 crc kubenswrapper[4563]: E1124 09:04:20.055135 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.116835 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.116877 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.116887 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.116903 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.116915 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:20Z","lastTransitionTime":"2025-11-24T09:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.219798 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.219842 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.219853 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.219872 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.219889 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:20Z","lastTransitionTime":"2025-11-24T09:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.322220 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.322290 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.322304 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.322328 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.322349 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:20Z","lastTransitionTime":"2025-11-24T09:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.424091 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.424141 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.424152 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.424176 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.424194 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:20Z","lastTransitionTime":"2025-11-24T09:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.527057 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.527105 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.527114 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.527131 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.527145 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:20Z","lastTransitionTime":"2025-11-24T09:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.628924 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.628967 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.628976 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.628992 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.629001 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:20Z","lastTransitionTime":"2025-11-24T09:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.731336 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.731374 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.731387 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.731400 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.731410 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:20Z","lastTransitionTime":"2025-11-24T09:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.833758 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.833819 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.833831 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.833858 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.833873 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:20Z","lastTransitionTime":"2025-11-24T09:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.838226 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-metrics-certs\") pod \"network-metrics-daemon-bsfsd\" (UID: \"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\") " pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:20 crc kubenswrapper[4563]: E1124 09:04:20.838393 4563 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 24 09:04:20 crc kubenswrapper[4563]: E1124 09:04:20.838494 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-metrics-certs podName:4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0 nodeName:}" failed. No retries permitted until 2025-11-24 09:04:22.838455202 +0000 UTC m=+40.097432649 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-metrics-certs") pod "network-metrics-daemon-bsfsd" (UID: "4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.936000 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.936037 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.936050 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.936064 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:20 crc kubenswrapper[4563]: I1124 09:04:20.936076 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:20Z","lastTransitionTime":"2025-11-24T09:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.037908 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.037953 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.037968 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.037988 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.038001 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:21Z","lastTransitionTime":"2025-11-24T09:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.054509 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:21 crc kubenswrapper[4563]: E1124 09:04:21.054664 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.139519 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.139562 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.139574 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.139585 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.139598 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:21Z","lastTransitionTime":"2025-11-24T09:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.241508 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.241559 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.241572 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.241587 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.241598 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:21Z","lastTransitionTime":"2025-11-24T09:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.347003 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.347079 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.347095 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.347113 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.347130 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:21Z","lastTransitionTime":"2025-11-24T09:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.449447 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.449524 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.449538 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.449571 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.449584 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:21Z","lastTransitionTime":"2025-11-24T09:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.551824 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.551879 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.551890 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.551911 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.551923 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:21Z","lastTransitionTime":"2025-11-24T09:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.654668 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.654707 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.654717 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.654732 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.654743 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:21Z","lastTransitionTime":"2025-11-24T09:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.758048 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.758082 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.758094 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.758106 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.758116 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:21Z","lastTransitionTime":"2025-11-24T09:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.860378 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.860421 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.860439 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.860452 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.860460 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:21Z","lastTransitionTime":"2025-11-24T09:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.963483 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.963569 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.963583 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.963609 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:21 crc kubenswrapper[4563]: I1124 09:04:21.963622 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:21Z","lastTransitionTime":"2025-11-24T09:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.054608 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.054753 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.054722 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:22 crc kubenswrapper[4563]: E1124 09:04:22.054998 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:04:22 crc kubenswrapper[4563]: E1124 09:04:22.055130 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:04:22 crc kubenswrapper[4563]: E1124 09:04:22.055351 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.065557 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.065702 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.065778 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.065858 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.065924 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:22Z","lastTransitionTime":"2025-11-24T09:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.168686 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.168737 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.168751 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.168767 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.168776 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:22Z","lastTransitionTime":"2025-11-24T09:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.271168 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.271201 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.271212 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.271227 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.271236 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:22Z","lastTransitionTime":"2025-11-24T09:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.373799 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.373830 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.373841 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.373855 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.373864 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:22Z","lastTransitionTime":"2025-11-24T09:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.475870 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.475938 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.475949 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.475970 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.475982 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:22Z","lastTransitionTime":"2025-11-24T09:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.578143 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.578163 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.578172 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.578184 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.578192 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:22Z","lastTransitionTime":"2025-11-24T09:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.681059 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.681094 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.681105 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.681120 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.681129 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:22Z","lastTransitionTime":"2025-11-24T09:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.783125 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.783151 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.783162 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.783174 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.783184 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:22Z","lastTransitionTime":"2025-11-24T09:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.860064 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-metrics-certs\") pod \"network-metrics-daemon-bsfsd\" (UID: \"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\") " pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:22 crc kubenswrapper[4563]: E1124 09:04:22.860248 4563 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 24 09:04:22 crc kubenswrapper[4563]: E1124 09:04:22.860319 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-metrics-certs podName:4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0 nodeName:}" failed. No retries permitted until 2025-11-24 09:04:26.860296837 +0000 UTC m=+44.119274273 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-metrics-certs") pod "network-metrics-daemon-bsfsd" (UID: "4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.885789 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.885826 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.885836 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.885855 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.885867 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:22Z","lastTransitionTime":"2025-11-24T09:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.988363 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.988443 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.988455 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.988469 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:22 crc kubenswrapper[4563]: I1124 09:04:22.988481 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:22Z","lastTransitionTime":"2025-11-24T09:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.054736 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:23 crc kubenswrapper[4563]: E1124 09:04:23.054905 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.073496 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:23Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.082868 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:23Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.090088 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.090122 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.090134 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.090151 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.090161 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:23Z","lastTransitionTime":"2025-11-24T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.097720 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:23Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.107090 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bsfsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bsfsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:23Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:23 crc 
kubenswrapper[4563]: I1124 09:04:23.116405 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:23Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.123603 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:23Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.131868 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:23Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.141290 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:23Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.150911 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f1b2e1de3f2bb8bdc6bf1268e8ef7833d05f31b700a7ed752c73b450de3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:23Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.158586 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9c54737-f104-46e9-86b9-0e9ce7915e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f1234c57f8afc7f471626ad72c5c5c9bba242e98b468a7554585ac1e0c1aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f
4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0090fc7a8b837592f800c6aadc864f6fd9aa175561da4bb7cdf19c8e174884d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:17Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5ssmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:23Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.168083 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o
://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] 
\\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:23Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.178191 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-24T09:04:23Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.188926 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:23Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.192066 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.192115 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.192127 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.192143 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.192154 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:23Z","lastTransitionTime":"2025-11-24T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.197980 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.198012 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.198058 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.198083 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.198094 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:23Z","lastTransitionTime":"2025-11-24T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.197984 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:23Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.206771 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:23Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:23 crc kubenswrapper[4563]: E1124 09:04:23.208280 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:23Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.210887 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.210932 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.210944 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.210961 
4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.210972 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:23Z","lastTransitionTime":"2025-11-24T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:23 crc kubenswrapper[4563]: E1124 09:04:23.219440 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:23Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.221006 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://781774c6161f09cef82afcca790bdec5fe0b402f1522104c3e628ddd61510db8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781774c6161f09cef82afcca790bdec5fe0b402f1522104c3e628ddd61510db8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:14Z\\\",\\\"message\\\":\\\"g reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1124 09:04:14.876314 6023 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876546 6023 reflector.go:311] Stopping reflector 
*v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1124 09:04:14.876655 6023 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876700 6023 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876804 6023 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.877255 6023 factory.go:656] Stopping watch factory\\\\nI1124 09:04:14.930167 6023 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1124 09:04:14.930193 6023 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1124 09:04:14.930244 6023 ovnkube.go:599] Stopped ovnkube\\\\nI1124 09:04:14.930277 6023 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1124 09:04:14.930370 6023 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vgbgr_openshift-ovn-kubernetes(cee9b713-10b0-49a5-841d-fbb083faba9a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf78966056
4d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:23Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.222204 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.222237 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.222247 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.222260 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.222269 4563 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:23Z","lastTransitionTime":"2025-11-24T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.230392 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f8b7c51d44b5fb45c9f7142ebab598e5bbb2aa302fb70cdf67680e9f2d96ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:23Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:23 crc kubenswrapper[4563]: E1124 09:04:23.231694 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:23Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.234287 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.234321 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.234332 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.234347 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.234358 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:23Z","lastTransitionTime":"2025-11-24T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:23 crc kubenswrapper[4563]: E1124 09:04:23.259201 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:23Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:23 crc kubenswrapper[4563]: E1124 09:04:23.259302 4563 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.293571 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.293620 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.293633 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.293669 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.293690 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:23Z","lastTransitionTime":"2025-11-24T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.395619 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.395675 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.395688 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.395707 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.395718 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:23Z","lastTransitionTime":"2025-11-24T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.498204 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.498259 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.498271 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.498289 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.498304 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:23Z","lastTransitionTime":"2025-11-24T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.601097 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.601137 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.601150 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.601163 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.601175 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:23Z","lastTransitionTime":"2025-11-24T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.704841 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.704885 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.704896 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.704918 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.704929 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:23Z","lastTransitionTime":"2025-11-24T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.808123 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.808188 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.808199 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.808220 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.808238 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:23Z","lastTransitionTime":"2025-11-24T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.912029 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.912080 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.912090 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.912131 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:23 crc kubenswrapper[4563]: I1124 09:04:23.912140 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:23Z","lastTransitionTime":"2025-11-24T09:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.014406 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.014465 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.014481 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.014509 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.014524 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:24Z","lastTransitionTime":"2025-11-24T09:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.054666 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.054802 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:24 crc kubenswrapper[4563]: E1124 09:04:24.054855 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.054680 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:24 crc kubenswrapper[4563]: E1124 09:04:24.055008 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:04:24 crc kubenswrapper[4563]: E1124 09:04:24.055240 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.116935 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.116981 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.116991 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.117007 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.117020 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:24Z","lastTransitionTime":"2025-11-24T09:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.219746 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.219790 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.219802 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.219819 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.219831 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:24Z","lastTransitionTime":"2025-11-24T09:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.322355 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.322406 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.322417 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.322434 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.322445 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:24Z","lastTransitionTime":"2025-11-24T09:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.424515 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.424583 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.424598 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.424618 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.424632 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:24Z","lastTransitionTime":"2025-11-24T09:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.526905 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.526966 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.526977 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.526998 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.527013 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:24Z","lastTransitionTime":"2025-11-24T09:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.629666 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.629719 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.629730 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.629747 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.629759 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:24Z","lastTransitionTime":"2025-11-24T09:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.731950 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.732000 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.732010 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.732028 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.732041 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:24Z","lastTransitionTime":"2025-11-24T09:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.834559 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.834617 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.834626 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.834663 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.834676 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:24Z","lastTransitionTime":"2025-11-24T09:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.937188 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.937236 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.937249 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.937266 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:24 crc kubenswrapper[4563]: I1124 09:04:24.937275 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:24Z","lastTransitionTime":"2025-11-24T09:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.039300 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.039350 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.039362 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.039385 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.039401 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:25Z","lastTransitionTime":"2025-11-24T09:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.053806 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:25 crc kubenswrapper[4563]: E1124 09:04:25.053925 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.142048 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.142099 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.142108 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.142125 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.142136 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:25Z","lastTransitionTime":"2025-11-24T09:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.245322 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.245405 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.245418 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.245438 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.245461 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:25Z","lastTransitionTime":"2025-11-24T09:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.348331 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.348372 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.348383 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.348399 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.348410 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:25Z","lastTransitionTime":"2025-11-24T09:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.451471 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.451525 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.451536 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.451569 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.451583 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:25Z","lastTransitionTime":"2025-11-24T09:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.553949 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.554000 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.554011 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.554030 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.554045 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:25Z","lastTransitionTime":"2025-11-24T09:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.656868 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.656928 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.656940 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.656961 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.656978 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:25Z","lastTransitionTime":"2025-11-24T09:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.760517 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.760579 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.760589 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.760605 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.760615 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:25Z","lastTransitionTime":"2025-11-24T09:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.862877 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.862934 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.862944 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.862966 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.862980 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:25Z","lastTransitionTime":"2025-11-24T09:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.965764 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.965818 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.965829 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.965854 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:04:25 crc kubenswrapper[4563]: I1124 09:04:25.965870 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:25Z","lastTransitionTime":"2025-11-24T09:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.054456 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.054493 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.054480 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 24 09:04:26 crc kubenswrapper[4563]: E1124 09:04:26.054660 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 24 09:04:26 crc kubenswrapper[4563]: E1124 09:04:26.054787 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 24 09:04:26 crc kubenswrapper[4563]: E1124 09:04:26.054893 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.068601 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.068659 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.068672 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.068692 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.068704 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:26Z","lastTransitionTime":"2025-11-24T09:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.171740 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.171806 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.171819 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.171853 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.171868 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:26Z","lastTransitionTime":"2025-11-24T09:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.274340 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.274406 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.274418 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.274439 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.274454 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:26Z","lastTransitionTime":"2025-11-24T09:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.377181 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.377243 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.377256 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.377276 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.377291 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:26Z","lastTransitionTime":"2025-11-24T09:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.479541 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.479618 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.479670 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.479690 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.479703 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:26Z","lastTransitionTime":"2025-11-24T09:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.582482 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.582532 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.582554 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.582574 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.582586 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:26Z","lastTransitionTime":"2025-11-24T09:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.684878 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.684963 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.684984 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.685017 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.685033 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:26Z","lastTransitionTime":"2025-11-24T09:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.787255 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.787348 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.787360 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.787384 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.787401 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:26Z","lastTransitionTime":"2025-11-24T09:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.889159 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.889242 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.889257 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.889281 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.889292 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:26Z","lastTransitionTime":"2025-11-24T09:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.897500 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-metrics-certs\") pod \"network-metrics-daemon-bsfsd\" (UID: \"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\") " pod="openshift-multus/network-metrics-daemon-bsfsd"
Nov 24 09:04:26 crc kubenswrapper[4563]: E1124 09:04:26.897859 4563 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Nov 24 09:04:26 crc kubenswrapper[4563]: E1124 09:04:26.897974 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-metrics-certs podName:4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0 nodeName:}" failed. No retries permitted until 2025-11-24 09:04:34.897944637 +0000 UTC m=+52.156922094 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-metrics-certs") pod "network-metrics-daemon-bsfsd" (UID: "4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.991377 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.991423 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.991435 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.991458 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:04:26 crc kubenswrapper[4563]: I1124 09:04:26.991471 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:26Z","lastTransitionTime":"2025-11-24T09:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.054464 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd"
Nov 24 09:04:27 crc kubenswrapper[4563]: E1124 09:04:27.054614 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.094295 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.094328 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.094341 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.094359 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.094373 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:27Z","lastTransitionTime":"2025-11-24T09:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.196179 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.196227 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.196239 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.196258 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.196273 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:27Z","lastTransitionTime":"2025-11-24T09:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.298467 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.298511 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.298523 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.298550 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.298563 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:27Z","lastTransitionTime":"2025-11-24T09:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.401450 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.401508 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.401533 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.401566 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.401583 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:27Z","lastTransitionTime":"2025-11-24T09:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.504429 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.504473 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.504484 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.504500 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.504510 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:27Z","lastTransitionTime":"2025-11-24T09:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.606471 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.606519 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.606529 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.606554 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.606569 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:27Z","lastTransitionTime":"2025-11-24T09:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.709907 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.709958 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.709972 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.709992 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.710003 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:27Z","lastTransitionTime":"2025-11-24T09:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.812705 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.812752 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.812764 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.812778 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.812794 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:27Z","lastTransitionTime":"2025-11-24T09:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.915428 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.915457 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.915468 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.915480 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:04:27 crc kubenswrapper[4563]: I1124 09:04:27.915491 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:27Z","lastTransitionTime":"2025-11-24T09:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.018419 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.018475 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.018485 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.018505 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.018517 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:28Z","lastTransitionTime":"2025-11-24T09:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.054161 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.054198 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.054214 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:28 crc kubenswrapper[4563]: E1124 09:04:28.054316 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:04:28 crc kubenswrapper[4563]: E1124 09:04:28.054451 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:04:28 crc kubenswrapper[4563]: E1124 09:04:28.054483 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.120792 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.120826 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.120838 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.120854 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.120868 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:28Z","lastTransitionTime":"2025-11-24T09:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.223705 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.223749 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.223758 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.223770 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.223778 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:28Z","lastTransitionTime":"2025-11-24T09:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.326026 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.326059 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.326069 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.326080 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.326087 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:28Z","lastTransitionTime":"2025-11-24T09:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.428515 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.428576 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.428586 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.428607 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.428621 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:28Z","lastTransitionTime":"2025-11-24T09:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.531217 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.531247 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.531256 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.531269 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.531277 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:28Z","lastTransitionTime":"2025-11-24T09:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.633452 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.633495 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.633507 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.633524 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.633537 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:28Z","lastTransitionTime":"2025-11-24T09:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.735352 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.735392 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.735402 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.735414 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.735425 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:28Z","lastTransitionTime":"2025-11-24T09:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.837831 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.837891 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.837901 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.837917 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.837929 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:28Z","lastTransitionTime":"2025-11-24T09:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.939998 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.940025 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.940038 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.940048 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:28 crc kubenswrapper[4563]: I1124 09:04:28.940058 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:28Z","lastTransitionTime":"2025-11-24T09:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.041331 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.041385 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.041396 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.041405 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.041415 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:29Z","lastTransitionTime":"2025-11-24T09:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.054776 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:29 crc kubenswrapper[4563]: E1124 09:04:29.054910 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.143736 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.143770 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.143778 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.143788 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.143796 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:29Z","lastTransitionTime":"2025-11-24T09:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.245086 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.245114 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.245122 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.245135 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.245143 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:29Z","lastTransitionTime":"2025-11-24T09:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.349587 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.349684 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.349702 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.349725 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.349746 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:29Z","lastTransitionTime":"2025-11-24T09:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.452333 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.452396 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.452414 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.452433 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.452444 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:29Z","lastTransitionTime":"2025-11-24T09:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.554808 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.554864 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.554876 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.554901 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.554914 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:29Z","lastTransitionTime":"2025-11-24T09:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.657055 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.657089 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.657100 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.657113 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.657123 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:29Z","lastTransitionTime":"2025-11-24T09:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.760480 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.760530 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.760552 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.760574 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.760588 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:29Z","lastTransitionTime":"2025-11-24T09:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.863006 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.863058 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.863092 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.863114 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.863127 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:29Z","lastTransitionTime":"2025-11-24T09:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.965789 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.965830 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.965841 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.965855 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:29 crc kubenswrapper[4563]: I1124 09:04:29.965865 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:29Z","lastTransitionTime":"2025-11-24T09:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.053811 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.053856 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.053812 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:30 crc kubenswrapper[4563]: E1124 09:04:30.053952 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:04:30 crc kubenswrapper[4563]: E1124 09:04:30.054089 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:04:30 crc kubenswrapper[4563]: E1124 09:04:30.054197 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.067977 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.068018 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.068032 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.068048 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.068060 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:30Z","lastTransitionTime":"2025-11-24T09:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.170327 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.170356 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.170365 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.170380 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.170389 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:30Z","lastTransitionTime":"2025-11-24T09:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.272687 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.272735 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.272750 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.272772 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.272783 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:30Z","lastTransitionTime":"2025-11-24T09:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.374649 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.374706 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.374718 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.374736 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.374748 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:30Z","lastTransitionTime":"2025-11-24T09:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.477160 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.477204 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.477215 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.477227 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.477237 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:30Z","lastTransitionTime":"2025-11-24T09:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.579122 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.579169 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.579181 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.579199 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.579212 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:30Z","lastTransitionTime":"2025-11-24T09:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.681448 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.681486 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.681498 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.681510 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.681518 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:30Z","lastTransitionTime":"2025-11-24T09:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.784014 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.784067 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.784080 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.784102 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.784115 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:30Z","lastTransitionTime":"2025-11-24T09:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.784296 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.792905 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.798871 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:30Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.809073 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a
2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-11-24T09:04:30Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.818473 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:30Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.831616 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://781774c6161f09cef82afcca790bdec5fe0b402f1522104c3e628ddd61510db8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781774c6161f09cef82afcca790bdec5fe0b402f1522104c3e628ddd61510db8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:14Z\\\",\\\"message\\\":\\\"g reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1124 09:04:14.876314 6023 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876546 6023 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1124 09:04:14.876655 6023 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876700 6023 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876804 6023 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.877255 6023 factory.go:656] Stopping watch factory\\\\nI1124 09:04:14.930167 6023 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1124 09:04:14.930193 6023 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1124 09:04:14.930244 6023 ovnkube.go:599] Stopped ovnkube\\\\nI1124 09:04:14.930277 6023 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1124 09:04:14.930370 6023 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vgbgr_openshift-ovn-kubernetes(cee9b713-10b0-49a5-841d-fbb083faba9a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf78966056
4d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:30Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.838454 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f8b7c51d44b5fb45c9f7142ebab598e5bbb2aa302fb70cdf67680e9f2d96ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:30Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.851624 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09
:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:30Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.860123 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:30Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.867970 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:30Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.875793 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:30Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.883314 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:30Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.889691 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.889748 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.889761 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.889777 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.889788 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:30Z","lastTransitionTime":"2025-11-24T09:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.896967 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:30Z 
is after 2025-08-24T17:21:41Z" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.905912 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:30Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.917621 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f1b2e1de3f2bb8bdc6bf1268e8ef7833d05f31b700a7ed752c73b450de3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c
958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:10Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:30Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.925842 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9c54737-f104-46e9-86b9-0e9ce7915e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f1234c57f8afc7f471626ad72c5c5c9bba242e98b468a7554585ac1e0c1aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0090fc7a8b837592f800c6aadc864f6fd9aa175561da4bb7cdf19c8e174884d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5ssmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-11-24T09:04:30Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.932945 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bsfsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bsfsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:30Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:30 crc 
kubenswrapper[4563]: I1124 09:04:30.942770 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127
527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for 
caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:30Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.951169 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-24T09:04:30Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.992505 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.992574 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.992587 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.992608 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:30 crc kubenswrapper[4563]: I1124 09:04:30.992623 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:30Z","lastTransitionTime":"2025-11-24T09:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.054529 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:31 crc kubenswrapper[4563]: E1124 09:04:31.054730 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.055294 4563 scope.go:117] "RemoveContainer" containerID="781774c6161f09cef82afcca790bdec5fe0b402f1522104c3e628ddd61510db8" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.094806 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.095198 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.095265 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.095339 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.095412 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:31Z","lastTransitionTime":"2025-11-24T09:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.198104 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.198146 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.198157 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.198181 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.198196 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:31Z","lastTransitionTime":"2025-11-24T09:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.266084 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vgbgr_cee9b713-10b0-49a5-841d-fbb083faba9a/ovnkube-controller/1.log" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.268471 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" event={"ID":"cee9b713-10b0-49a5-841d-fbb083faba9a","Type":"ContainerStarted","Data":"9676783ae9e3137d08ce88ad456e6c8964cb4529991cf5cac3e0cf0ec71841f9"} Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.269043 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.279485 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6
ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:31Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.290483 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f1b2e1de3f2bb8bdc6bf1268e8ef7833d05f31b700a7ed752c73b450de3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b
26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f869
0e3a1fc914e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:
04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:31Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.299816 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.299851 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.299861 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.299876 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.299886 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:31Z","lastTransitionTime":"2025-11-24T09:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.301359 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9c54737-f104-46e9-86b9-0e9ce7915e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f1234c57f8afc7f471626ad72c5c5c9bba242e98b468a7554585ac1e0c1aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"
ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0090fc7a8b837592f800c6aadc864f6fd9aa175561da4bb7cdf19c8e174884d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5ssmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:31Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.309764 4563 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-multus/network-metrics-daemon-bsfsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bsfsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:31Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:31 crc 
kubenswrapper[4563]: I1124 09:04:31.318896 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:31Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.327331 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:31Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.341426 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:31Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.359990 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\
\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for 
RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:31Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.389131 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a72f551-f3fe-4045-933d-cfaa976bd60f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d4c2140ed99b80bf42fb32fc17368efd71c90e29906dd340bcca1a5bb5fabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d718684a773c9c1c92ab5722ae89afe44cdd4485b7df17e2d2795ae64645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://265b586b95a5c747bfe4c1a82a536da07d8fded3433c572e7a73ec381565fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c2f690f06390616da3ae5e4c6670793559f05b6bbb4a8441cd23db81b863ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://78c2f690f06390616da3ae5e4c6670793559f05b6bbb4a8441cd23db81b863ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:31Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.400669 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-24T09:04:31Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.401770 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.401800 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.401810 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.401826 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.401836 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:31Z","lastTransitionTime":"2025-11-24T09:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.418982 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:31Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.439800 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9676783ae9e3137d08ce88ad456e6c8964cb4529991cf5cac3e0cf0ec71841f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781774c6161f09cef82afcca790bdec5fe0b402f1522104c3e628ddd61510db8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:14Z\\\",\\\"message\\\":\\\"g reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1124 09:04:14.876314 6023 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876546 6023 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1124 09:04:14.876655 6023 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876700 6023 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876804 6023 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.877255 6023 factory.go:656] Stopping watch factory\\\\nI1124 09:04:14.930167 6023 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1124 09:04:14.930193 6023 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1124 09:04:14.930244 6023 ovnkube.go:599] Stopped ovnkube\\\\nI1124 09:04:14.930277 6023 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1124 09:04:14.930370 6023 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:31Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.449516 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f8b7c51d44b5fb45c9f7142ebab598e5bbb2aa302fb70cdf67680e9f2d96ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:31Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.461762 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:31Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.473562 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:31Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.490500 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:31Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.501822 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:
44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:31Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.504462 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.504511 
4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.504523 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.504551 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.504576 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:31Z","lastTransitionTime":"2025-11-24T09:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.513234 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:31Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.607479 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.607523 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.607533 4563 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.607868 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.607879 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:31Z","lastTransitionTime":"2025-11-24T09:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.709954 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.710003 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.710019 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.710035 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.710044 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:31Z","lastTransitionTime":"2025-11-24T09:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.747212 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:04:31 crc kubenswrapper[4563]: E1124 09:04:31.747345 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:05:03.747318185 +0000 UTC m=+81.006295632 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.812008 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.812051 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.812061 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.812078 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:31 crc kubenswrapper[4563]: 
I1124 09:04:31.812089 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:31Z","lastTransitionTime":"2025-11-24T09:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.848053 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.848097 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.848119 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.848150 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:31 crc kubenswrapper[4563]: E1124 09:04:31.848251 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 09:04:31 crc kubenswrapper[4563]: E1124 09:04:31.848265 4563 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 24 09:04:31 crc kubenswrapper[4563]: E1124 09:04:31.848282 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 09:04:31 crc kubenswrapper[4563]: E1124 09:04:31.848291 4563 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 09:04:31 crc kubenswrapper[4563]: E1124 09:04:31.848330 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 09:05:03.848315087 +0000 UTC m=+81.107292534 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 24 09:04:31 crc kubenswrapper[4563]: E1124 09:04:31.848353 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 09:04:31 crc kubenswrapper[4563]: E1124 09:04:31.848391 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 09:04:31 crc kubenswrapper[4563]: E1124 09:04:31.848407 4563 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:04:31 crc kubenswrapper[4563]: E1124 09:04:31.848300 4563 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:04:31 crc kubenswrapper[4563]: E1124 09:04:31.848372 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 09:05:03.848354391 +0000 UTC m=+81.107331838 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 09:04:31 crc kubenswrapper[4563]: E1124 09:04:31.848466 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-24 09:05:03.848458076 +0000 UTC m=+81.107435523 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:04:31 crc kubenswrapper[4563]: E1124 09:04:31.848503 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-24 09:05:03.848479667 +0000 UTC m=+81.107457114 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.914242 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.914282 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.914294 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.914319 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:31 crc kubenswrapper[4563]: I1124 09:04:31.914336 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:31Z","lastTransitionTime":"2025-11-24T09:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.016741 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.016791 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.016802 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.016821 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.016835 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:32Z","lastTransitionTime":"2025-11-24T09:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.054362 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.054362 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.054515 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:32 crc kubenswrapper[4563]: E1124 09:04:32.054624 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:04:32 crc kubenswrapper[4563]: E1124 09:04:32.054726 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:04:32 crc kubenswrapper[4563]: E1124 09:04:32.054865 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.119569 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.119603 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.119614 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.119629 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.119656 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:32Z","lastTransitionTime":"2025-11-24T09:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.222330 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.222370 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.222382 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.222395 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.222404 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:32Z","lastTransitionTime":"2025-11-24T09:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.273066 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vgbgr_cee9b713-10b0-49a5-841d-fbb083faba9a/ovnkube-controller/2.log" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.274591 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vgbgr_cee9b713-10b0-49a5-841d-fbb083faba9a/ovnkube-controller/1.log" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.277862 4563 generic.go:334] "Generic (PLEG): container finished" podID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerID="9676783ae9e3137d08ce88ad456e6c8964cb4529991cf5cac3e0cf0ec71841f9" exitCode=1 Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.277913 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" event={"ID":"cee9b713-10b0-49a5-841d-fbb083faba9a","Type":"ContainerDied","Data":"9676783ae9e3137d08ce88ad456e6c8964cb4529991cf5cac3e0cf0ec71841f9"} Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.277968 4563 scope.go:117] "RemoveContainer" containerID="781774c6161f09cef82afcca790bdec5fe0b402f1522104c3e628ddd61510db8" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.278858 4563 scope.go:117] "RemoveContainer" containerID="9676783ae9e3137d08ce88ad456e6c8964cb4529991cf5cac3e0cf0ec71841f9" Nov 24 09:04:32 crc kubenswrapper[4563]: E1124 09:04:32.279566 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vgbgr_openshift-ovn-kubernetes(cee9b713-10b0-49a5-841d-fbb083faba9a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.296448 4563 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc5
1579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d0
9fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:32Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.306891 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c
9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tr
ue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:32Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.317103 4563 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:32Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.325119 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.325181 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.325198 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.325230 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.325280 4563 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:32Z","lastTransitionTime":"2025-11-24T09:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.329469 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:32Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.337829 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:32Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.347100 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:32Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.356011 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:32Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.368821 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f1b2e1de3f2bb8bdc6bf1268e8ef7833d05f31b700a7ed752c73b450de3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:32Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.378294 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9c54737-f104-46e9-86b9-0e9ce7915e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f1234c57f8afc7f471626ad72c5c5c9bba242e98b468a7554585ac1e0c1aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f
4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0090fc7a8b837592f800c6aadc864f6fd9aa175561da4bb7cdf19c8e174884d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:17Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5ssmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:32Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.390045 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bsfsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bsfsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:32Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:32 crc 
kubenswrapper[4563]: I1124 09:04:32.403615 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127
527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for 
caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:32Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.412712 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a72f551-f3fe-4045-933d-cfaa976bd60f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d4c2140ed99b80bf42fb32fc17368efd71c90e29906dd340bcca1a5bb5fabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d718684a773c9c1c92ab5722ae89afe44cdd4485b7df17e2d2795ae64645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://265b586b95a5c747bfe4c1a82a536da07d8fded3433c572e7a73ec381565fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c2f690f06390616da3ae5e4c6670793559f05b6bbb4a8441cd23db81b863ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://78c2f690f06390616da3ae5e4c6670793559f05b6bbb4a8441cd23db81b863ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:32Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.420947 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-24T09:04:32Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.428127 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.428172 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.428183 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.428200 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.428210 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:32Z","lastTransitionTime":"2025-11-24T09:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.430024 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:32Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.439711 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:32Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.448201 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:32Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.460915 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9676783ae9e3137d08ce88ad456e6c8964cb4529991cf5cac3e0cf0ec71841f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781774c6161f09cef82afcca790bdec5fe0b402f1522104c3e628ddd61510db8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:14Z\\\",\\\"message\\\":\\\"g reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1124 09:04:14.876314 6023 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876546 6023 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1124 09:04:14.876655 6023 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876700 6023 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876804 6023 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.877255 6023 factory.go:656] Stopping watch factory\\\\nI1124 09:04:14.930167 6023 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1124 09:04:14.930193 6023 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1124 09:04:14.930244 6023 ovnkube.go:599] Stopped ovnkube\\\\nI1124 09:04:14.930277 6023 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1124 09:04:14.930370 6023 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9676783ae9e3137d08ce88ad456e6c8964cb4529991cf5cac3e0cf0ec71841f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:31Z\\\",\\\"message\\\":\\\"anager/kube-controller-manager-crc\\\\nI1124 09:04:31.786010 6247 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1124 09:04:31.786019 6247 obj_retry.go:386] Retry successful for 
*v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1124 09:04:31.786026 6247 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-vgbgr\\\\nI1124 09:04:31.785854 6247 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-bsfsd] creating logical port openshift-multus_network-metrics-daemon-bsfsd for pod on switch crc\\\\nI1124 09:04:31.786041 6247 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-vgbgr\\\\nF1124 09:04:31.786044 6247 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed 
ca\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:32Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.468190 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f8b7c51d44b5fb45c9f7142ebab598e5bbb2aa302fb70cdf67680e9f2d96ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:32Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.530930 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.530970 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.530983 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.531008 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.531021 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:32Z","lastTransitionTime":"2025-11-24T09:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.633421 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.633459 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.633468 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.633485 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.633495 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:32Z","lastTransitionTime":"2025-11-24T09:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.735519 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.735586 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.735598 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.735615 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.735629 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:32Z","lastTransitionTime":"2025-11-24T09:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.838682 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.838734 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.838746 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.838766 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.838779 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:32Z","lastTransitionTime":"2025-11-24T09:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.940891 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.940927 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.940941 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.940957 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:32 crc kubenswrapper[4563]: I1124 09:04:32.940967 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:32Z","lastTransitionTime":"2025-11-24T09:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.043231 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.043283 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.043293 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.043313 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.043327 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:33Z","lastTransitionTime":"2025-11-24T09:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.054555 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:33 crc kubenswrapper[4563]: E1124 09:04:33.054707 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.066146 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.076301 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2e
e959b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacc
ount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.086978 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7
071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-1
1-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.097840 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f1b2e1de3f2bb8bdc6bf1268e8ef7833d05f31b700a7ed752c73b450de3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:0
8Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:11Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.106789 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9c54737-f104-46e9-86b9-0e9ce7915e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f1234c57f8afc7f471626ad72c5c5c9bba242e98b468a7554585ac1e0c1aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0090fc7a8b837592f800c6aadc864f6fd9aa
175561da4bb7cdf19c8e174884d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5ssmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.116162 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bsfsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bsfsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc 
kubenswrapper[4563]: I1124 09:04:33.126931 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.135518 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a72f551-f3fe-4045-933d-cfaa976bd60f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d4c2140ed99b80bf42fb32fc17368efd71c90e29906dd340bcca1a5bb5fabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d718684a773c9c1c92ab5722ae89afe44cdd4485b7df17e2d2795ae64645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://265b586b95a5c747bfe4c1a82a536da07d8fded3433c572e7a73ec381565fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c2f690f06390616da3ae5e4c6670793559f05b6bbb4a8441cd23db81b863ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://78c2f690f06390616da3ae5e4c6670793559f05b6bbb4a8441cd23db81b863ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.144419 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.145867 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.145916 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.145927 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.145948 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.145960 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:33Z","lastTransitionTime":"2025-11-24T09:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.156482 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41d
d18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.166205 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.175356 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.185190 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.204050 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9676783ae9e3137d08ce88ad456e6c8964cb4529991cf5cac3e0cf0ec71841f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://781774c6161f09cef82afcca790bdec5fe0b402f1522104c3e628ddd61510db8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:14Z\\\",\\\"message\\\":\\\"g reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1124 09:04:14.876314 6023 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876546 6023 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1124 09:04:14.876655 6023 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876700 6023 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.876804 6023 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1124 09:04:14.877255 6023 factory.go:656] Stopping watch factory\\\\nI1124 09:04:14.930167 6023 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1124 09:04:14.930193 6023 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1124 09:04:14.930244 6023 ovnkube.go:599] Stopped ovnkube\\\\nI1124 09:04:14.930277 6023 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1124 09:04:14.930370 6023 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9676783ae9e3137d08ce88ad456e6c8964cb4529991cf5cac3e0cf0ec71841f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:31Z\\\",\\\"message\\\":\\\"anager/kube-controller-manager-crc\\\\nI1124 09:04:31.786010 6247 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1124 09:04:31.786019 6247 obj_retry.go:386] Retry successful for 
*v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1124 09:04:31.786026 6247 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-vgbgr\\\\nI1124 09:04:31.785854 6247 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-bsfsd] creating logical port openshift-multus_network-metrics-daemon-bsfsd for pod on switch crc\\\\nI1124 09:04:31.786041 6247 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-vgbgr\\\\nF1124 09:04:31.786044 6247 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed 
ca\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.212783 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f8b7c51d44b5fb45c9f7142ebab598e5bbb2aa302fb70cdf67680e9f2d96ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.221865 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.230332 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.244880 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.247833 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.247873 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.247886 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.247904 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.247918 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:33Z","lastTransitionTime":"2025-11-24T09:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.283384 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vgbgr_cee9b713-10b0-49a5-841d-fbb083faba9a/ovnkube-controller/2.log" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.286910 4563 scope.go:117] "RemoveContainer" containerID="9676783ae9e3137d08ce88ad456e6c8964cb4529991cf5cac3e0cf0ec71841f9" Nov 24 09:04:33 crc kubenswrapper[4563]: E1124 09:04:33.287217 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vgbgr_openshift-ovn-kubernetes(cee9b713-10b0-49a5-841d-fbb083faba9a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.301557 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.310155 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:
44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.319045 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.327122 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b02
3937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.335548 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.335585 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.335597 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:33 crc 
kubenswrapper[4563]: I1124 09:04:33.335617 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.335632 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:33Z","lastTransitionTime":"2025-11-24T09:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.339321 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f1b2e1de3f2bb8bdc6bf1268e8ef7833d05f31b700a7ed752c73b450de3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc
d3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: E1124 09:04:33.346388 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.348773 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9c54737-f104-46e9-86b9-0e9ce7915e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f1234c57f8afc7f471626ad72c5c5c9bba242e98b468a7554585ac1e0c1aa7a\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0090fc7a8b837592f800c6aadc864f6fd9aa175561da4bb7cdf19c8e174884d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5ssmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.350461 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.350508 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.350522 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.350554 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.350568 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:33Z","lastTransitionTime":"2025-11-24T09:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.356508 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bsfsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bsfsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc 
kubenswrapper[4563]: E1124 09:04:33.360050 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.363342 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.363386 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.363398 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.363418 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.363431 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:33Z","lastTransitionTime":"2025-11-24T09:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.367406 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: E1124 09:04:33.376068 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.378574 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.378624 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.378663 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.378675 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.378683 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:33Z","lastTransitionTime":"2025-11-24T09:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.379722 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: E1124 09:04:33.387425 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.388678 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.390050 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:33 crc 
kubenswrapper[4563]: I1124 09:04:33.390084 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.390096 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.390113 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.390123 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:33Z","lastTransitionTime":"2025-11-24T09:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.398323 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\"
,\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default 
state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: E1124 09:04:33.400076 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: E1124 09:04:33.400193 4563 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.401470 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.401506 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.401520 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.401545 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.401556 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:33Z","lastTransitionTime":"2025-11-24T09:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.406485 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a72f551-f3fe-4045-933d-cfaa976bd60f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d4c2140ed99b80bf42fb32fc17368efd71c90e29906dd340bcca1a5bb5fabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13
d718684a773c9c1c92ab5722ae89afe44cdd4485b7df17e2d2795ae64645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://265b586b95a5c747bfe4c1a82a536da07d8fded3433c572e7a73ec381565fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c2f690f06390616da3ae5e4c6670793559f05b6bbb4a8441cd23db81b863ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78c2f690f06390616da3ae5e4c6670793559f05b6bbb4a8441cd23db81b863ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.414454 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.424290 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.436562 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9676783ae9e3137d08ce88ad456e6c8964cb4529991cf5cac3e0cf0ec71841f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9676783ae9e3137d08ce88ad456e6c8964cb4529991cf5cac3e0cf0ec71841f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:31Z\\\",\\\"message\\\":\\\"anager/kube-controller-manager-crc\\\\nI1124 09:04:31.786010 6247 ovn.go:134] Ensuring 
zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1124 09:04:31.786019 6247 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1124 09:04:31.786026 6247 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-vgbgr\\\\nI1124 09:04:31.785854 6247 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-bsfsd] creating logical port openshift-multus_network-metrics-daemon-bsfsd for pod on switch crc\\\\nI1124 09:04:31.786041 6247 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-vgbgr\\\\nF1124 09:04:31.786044 6247 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed ca\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vgbgr_openshift-ovn-kubernetes(cee9b713-10b0-49a5-841d-fbb083faba9a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf78966056
4d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.443777 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f8b7c51d44b5fb45c9f7142ebab598e5bbb2aa302fb70cdf67680e9f2d96ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.452703 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.462172 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:33Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.504107 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.504171 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.504189 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.504215 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.504228 4563 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:33Z","lastTransitionTime":"2025-11-24T09:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.606872 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.606914 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.606941 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.606962 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.606977 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:33Z","lastTransitionTime":"2025-11-24T09:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.709467 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.709519 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.709541 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.709559 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.709573 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:33Z","lastTransitionTime":"2025-11-24T09:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.811611 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.811671 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.811683 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.811696 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.811707 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:33Z","lastTransitionTime":"2025-11-24T09:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.913723 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.913762 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.913773 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.913787 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:33 crc kubenswrapper[4563]: I1124 09:04:33.913797 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:33Z","lastTransitionTime":"2025-11-24T09:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.015677 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.015713 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.015722 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.015732 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.015740 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:34Z","lastTransitionTime":"2025-11-24T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.054528 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.054599 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.054658 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:34 crc kubenswrapper[4563]: E1124 09:04:34.054729 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:04:34 crc kubenswrapper[4563]: E1124 09:04:34.054864 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:04:34 crc kubenswrapper[4563]: E1124 09:04:34.054979 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.118201 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.118279 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.118295 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.118307 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.118314 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:34Z","lastTransitionTime":"2025-11-24T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.220181 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.220232 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.220243 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.220256 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.220265 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:34Z","lastTransitionTime":"2025-11-24T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.322063 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.322100 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.322112 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.322129 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.322140 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:34Z","lastTransitionTime":"2025-11-24T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.424123 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.424157 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.424169 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.424181 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.424193 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:34Z","lastTransitionTime":"2025-11-24T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.525992 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.526050 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.526059 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.526075 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.526082 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:34Z","lastTransitionTime":"2025-11-24T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.628200 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.628236 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.628245 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.628257 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.628266 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:34Z","lastTransitionTime":"2025-11-24T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.730665 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.730702 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.730712 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.730723 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.730733 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:34Z","lastTransitionTime":"2025-11-24T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.832858 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.832884 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.832892 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.832902 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.832911 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:34Z","lastTransitionTime":"2025-11-24T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.934458 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.934493 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.934504 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.934513 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.934525 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:34Z","lastTransitionTime":"2025-11-24T09:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:34 crc kubenswrapper[4563]: I1124 09:04:34.976460 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-metrics-certs\") pod \"network-metrics-daemon-bsfsd\" (UID: \"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\") " pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:34 crc kubenswrapper[4563]: E1124 09:04:34.976676 4563 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 24 09:04:34 crc kubenswrapper[4563]: E1124 09:04:34.976784 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-metrics-certs podName:4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0 nodeName:}" failed. No retries permitted until 2025-11-24 09:04:50.976756926 +0000 UTC m=+68.235734373 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-metrics-certs") pod "network-metrics-daemon-bsfsd" (UID: "4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.036998 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.037043 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.037056 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.037072 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.037081 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:35Z","lastTransitionTime":"2025-11-24T09:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.054802 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:35 crc kubenswrapper[4563]: E1124 09:04:35.054969 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.140001 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.140048 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.140059 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.140076 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.140089 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:35Z","lastTransitionTime":"2025-11-24T09:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.242797 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.242842 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.242856 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.242873 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.242885 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:35Z","lastTransitionTime":"2025-11-24T09:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.345374 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.345422 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.345433 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.345453 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.345468 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:35Z","lastTransitionTime":"2025-11-24T09:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.448102 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.448171 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.448182 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.448209 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.448222 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:35Z","lastTransitionTime":"2025-11-24T09:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.551324 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.551374 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.551387 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.551406 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.551424 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:35Z","lastTransitionTime":"2025-11-24T09:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.654997 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.655048 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.655060 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.655087 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.655103 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:35Z","lastTransitionTime":"2025-11-24T09:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.758051 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.758097 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.758110 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.758133 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.758145 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:35Z","lastTransitionTime":"2025-11-24T09:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.860717 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.860769 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.860780 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.860797 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.860808 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:35Z","lastTransitionTime":"2025-11-24T09:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.962521 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.962576 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.962587 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.962609 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:35 crc kubenswrapper[4563]: I1124 09:04:35.962623 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:35Z","lastTransitionTime":"2025-11-24T09:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.054687 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.054758 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.054793 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:36 crc kubenswrapper[4563]: E1124 09:04:36.054875 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:04:36 crc kubenswrapper[4563]: E1124 09:04:36.054979 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:04:36 crc kubenswrapper[4563]: E1124 09:04:36.055120 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.064761 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.064795 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.064804 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.064819 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.064837 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:36Z","lastTransitionTime":"2025-11-24T09:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.168172 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.168214 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.168224 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.168242 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.168252 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:36Z","lastTransitionTime":"2025-11-24T09:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.270733 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.270773 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.270782 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.270796 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.270807 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:36Z","lastTransitionTime":"2025-11-24T09:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.373142 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.373200 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.373210 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.373229 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.373245 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:36Z","lastTransitionTime":"2025-11-24T09:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.475893 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.475930 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.475940 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.475954 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.475963 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:36Z","lastTransitionTime":"2025-11-24T09:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.578434 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.578477 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.578485 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.578510 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.578518 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:36Z","lastTransitionTime":"2025-11-24T09:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.680654 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.680695 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.680704 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.680717 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.680726 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:36Z","lastTransitionTime":"2025-11-24T09:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.782925 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.783274 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.783441 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.783624 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.783792 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:36Z","lastTransitionTime":"2025-11-24T09:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.886139 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.886174 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.886184 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.886199 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.886210 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:36Z","lastTransitionTime":"2025-11-24T09:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.989253 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.989302 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.989311 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.989324 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:36 crc kubenswrapper[4563]: I1124 09:04:36.989332 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:36Z","lastTransitionTime":"2025-11-24T09:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.053950 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:37 crc kubenswrapper[4563]: E1124 09:04:37.054086 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.091572 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.091613 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.091622 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.091655 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.091665 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:37Z","lastTransitionTime":"2025-11-24T09:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.193293 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.193582 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.193672 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.193749 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.193811 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:37Z","lastTransitionTime":"2025-11-24T09:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.296351 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.296401 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.296409 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.296440 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.296448 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:37Z","lastTransitionTime":"2025-11-24T09:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.398829 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.398866 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.398875 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.398888 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.398897 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:37Z","lastTransitionTime":"2025-11-24T09:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.502108 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.502149 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.502162 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.502178 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.502186 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:37Z","lastTransitionTime":"2025-11-24T09:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.604187 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.604220 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.604228 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.604242 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.604260 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:37Z","lastTransitionTime":"2025-11-24T09:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.707085 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.707132 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.707141 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.707159 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.707169 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:37Z","lastTransitionTime":"2025-11-24T09:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.809448 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.809476 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.809484 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.809496 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.809504 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:37Z","lastTransitionTime":"2025-11-24T09:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.912345 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.912383 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.912401 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.912417 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:37 crc kubenswrapper[4563]: I1124 09:04:37.912426 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:37Z","lastTransitionTime":"2025-11-24T09:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.014955 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.015027 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.015037 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.015051 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.015061 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:38Z","lastTransitionTime":"2025-11-24T09:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.054003 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.054043 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.054075 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:38 crc kubenswrapper[4563]: E1124 09:04:38.054102 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:04:38 crc kubenswrapper[4563]: E1124 09:04:38.054172 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:04:38 crc kubenswrapper[4563]: E1124 09:04:38.054270 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.116772 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.116822 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.116833 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.116845 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.116854 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:38Z","lastTransitionTime":"2025-11-24T09:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.219869 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.219913 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.219924 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.219939 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.219948 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:38Z","lastTransitionTime":"2025-11-24T09:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.322279 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.322337 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.322348 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.322360 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.322369 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:38Z","lastTransitionTime":"2025-11-24T09:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.424800 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.424838 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.424849 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.424864 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.424873 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:38Z","lastTransitionTime":"2025-11-24T09:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.527135 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.527170 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.527179 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.527191 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.527201 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:38Z","lastTransitionTime":"2025-11-24T09:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.629682 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.629720 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.629729 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.629745 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.629754 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:38Z","lastTransitionTime":"2025-11-24T09:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.732791 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.732837 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.732847 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.732861 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.732870 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:38Z","lastTransitionTime":"2025-11-24T09:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.835842 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.835880 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.835888 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.835902 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.835911 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:38Z","lastTransitionTime":"2025-11-24T09:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.938371 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.938408 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.938417 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.938439 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:38 crc kubenswrapper[4563]: I1124 09:04:38.938448 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:38Z","lastTransitionTime":"2025-11-24T09:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.040441 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.040479 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.040490 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.040505 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.040513 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:39Z","lastTransitionTime":"2025-11-24T09:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.053875 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:39 crc kubenswrapper[4563]: E1124 09:04:39.054003 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.142271 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.142311 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.142322 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.142337 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.142348 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:39Z","lastTransitionTime":"2025-11-24T09:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.244608 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.244681 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.244700 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.244715 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.244724 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:39Z","lastTransitionTime":"2025-11-24T09:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.347169 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.347198 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.347207 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.347218 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.347228 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:39Z","lastTransitionTime":"2025-11-24T09:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.449514 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.449560 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.449569 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.449579 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.449587 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:39Z","lastTransitionTime":"2025-11-24T09:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.551297 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.551328 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.551336 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.551346 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.551354 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:39Z","lastTransitionTime":"2025-11-24T09:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.653591 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.653622 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.653631 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.653659 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.653668 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:39Z","lastTransitionTime":"2025-11-24T09:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.755904 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.755930 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.755939 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.755950 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.755957 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:39Z","lastTransitionTime":"2025-11-24T09:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.858371 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.858400 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.858408 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.858418 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.858425 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:39Z","lastTransitionTime":"2025-11-24T09:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.960496 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.960549 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.960559 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.960575 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:39 crc kubenswrapper[4563]: I1124 09:04:39.960585 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:39Z","lastTransitionTime":"2025-11-24T09:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.054631 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.054800 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:40 crc kubenswrapper[4563]: E1124 09:04:40.054996 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.055270 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:40 crc kubenswrapper[4563]: E1124 09:04:40.055399 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:04:40 crc kubenswrapper[4563]: E1124 09:04:40.055486 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.063212 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.063261 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.063275 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.063297 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.063310 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:40Z","lastTransitionTime":"2025-11-24T09:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.166348 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.166391 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.166401 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.166415 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.166428 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:40Z","lastTransitionTime":"2025-11-24T09:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.268898 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.268931 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.268942 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.268959 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.268971 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:40Z","lastTransitionTime":"2025-11-24T09:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.371149 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.371330 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.371399 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.371517 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.371589 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:40Z","lastTransitionTime":"2025-11-24T09:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.473349 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.473393 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.473403 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.473414 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.473422 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:40Z","lastTransitionTime":"2025-11-24T09:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.575308 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.575341 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.575353 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.575364 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.575373 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:40Z","lastTransitionTime":"2025-11-24T09:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.677314 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.677339 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.677346 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.677356 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.677363 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:40Z","lastTransitionTime":"2025-11-24T09:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.779388 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.779411 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.779422 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.779432 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.779440 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:40Z","lastTransitionTime":"2025-11-24T09:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.882191 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.882246 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.882259 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.882284 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.882295 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:40Z","lastTransitionTime":"2025-11-24T09:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.984540 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.984575 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.984584 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.984606 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:40 crc kubenswrapper[4563]: I1124 09:04:40.984616 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:40Z","lastTransitionTime":"2025-11-24T09:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.054909 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:41 crc kubenswrapper[4563]: E1124 09:04:41.055042 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.086768 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.086828 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.086840 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.086862 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.086877 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:41Z","lastTransitionTime":"2025-11-24T09:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.189351 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.189392 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.189402 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.189415 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.189425 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:41Z","lastTransitionTime":"2025-11-24T09:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.291451 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.291501 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.291512 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.291534 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.291550 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:41Z","lastTransitionTime":"2025-11-24T09:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.392707 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.392745 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.392755 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.392766 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.392775 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:41Z","lastTransitionTime":"2025-11-24T09:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.494811 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.494847 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.494859 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.494869 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.494877 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:41Z","lastTransitionTime":"2025-11-24T09:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.597011 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.597079 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.597091 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.597113 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.597126 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:41Z","lastTransitionTime":"2025-11-24T09:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.699549 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.699613 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.699628 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.699665 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.699686 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:41Z","lastTransitionTime":"2025-11-24T09:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.801881 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.801920 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.801930 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.801943 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.801955 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:41Z","lastTransitionTime":"2025-11-24T09:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.904121 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.904168 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.904180 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.904191 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:41 crc kubenswrapper[4563]: I1124 09:04:41.904199 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:41Z","lastTransitionTime":"2025-11-24T09:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.006214 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.006275 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.006287 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.006312 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.006325 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:42Z","lastTransitionTime":"2025-11-24T09:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.054069 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.054150 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:42 crc kubenswrapper[4563]: E1124 09:04:42.054179 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.054247 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:42 crc kubenswrapper[4563]: E1124 09:04:42.054291 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:04:42 crc kubenswrapper[4563]: E1124 09:04:42.054404 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.108735 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.108775 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.108786 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.108803 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.108818 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:42Z","lastTransitionTime":"2025-11-24T09:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.211391 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.211437 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.211447 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.211463 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.211475 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:42Z","lastTransitionTime":"2025-11-24T09:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.313168 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.313231 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.313242 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.313254 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.313266 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:42Z","lastTransitionTime":"2025-11-24T09:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.415120 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.415150 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.415162 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.415173 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.415182 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:42Z","lastTransitionTime":"2025-11-24T09:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.518144 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.518172 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.518183 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.518216 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.518227 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:42Z","lastTransitionTime":"2025-11-24T09:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.625036 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.625117 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.625159 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.625178 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.625195 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:42Z","lastTransitionTime":"2025-11-24T09:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.727963 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.728144 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.728215 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.728289 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.728358 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:42Z","lastTransitionTime":"2025-11-24T09:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.830221 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.830266 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.830278 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.830299 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.830311 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:42Z","lastTransitionTime":"2025-11-24T09:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.933558 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.933602 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.933614 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.933631 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:42 crc kubenswrapper[4563]: I1124 09:04:42.933663 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:42Z","lastTransitionTime":"2025-11-24T09:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.035684 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.035730 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.035740 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.035758 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.035770 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:43Z","lastTransitionTime":"2025-11-24T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.053788 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:43 crc kubenswrapper[4563]: E1124 09:04:43.053971 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.065377 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bsfsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bsfsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:43Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:43 crc 
kubenswrapper[4563]: I1124 09:04:43.075960 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:43Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.084811 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:43Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.094786 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:43Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.102841 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:43Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.113490 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f1b2e1de3f2bb8bdc6bf1268e8ef7833d05f31b700a7ed752c73b450de3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:43Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.129919 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9c54737-f104-46e9-86b9-0e9ce7915e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f1234c57f8afc7f471626ad72c5c5c9bba242e98b468a7554585ac1e0c1aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f
4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0090fc7a8b837592f800c6aadc864f6fd9aa175561da4bb7cdf19c8e174884d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:17Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5ssmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:43Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.137850 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.137892 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.137905 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.137928 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.137946 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:43Z","lastTransitionTime":"2025-11-24T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.143173 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41d
d18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:43Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.153034 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a72f551-f3fe-4045-933d-cfaa976bd60f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d4c2140ed99b80bf42fb32fc17368efd71c90e29906dd340bcca1a5bb5fabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d718684a773c9c1c92ab5722ae89afe44cdd4485b7df17e2d2795ae64645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://265b586b95a5c747bfe4c1a82a536da07d8fded3433c572e7a73ec381565fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c2f690f06390616da3ae5e4c6670793559f05b6bbb4a8441cd23db81b863ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://78c2f690f06390616da3ae5e4c6670793559f05b6bbb4a8441cd23db81b863ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:43Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.162351 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-24T09:04:43Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.171897 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:43Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.181590 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\
\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:43Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.191436 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:43Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.205307 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9676783ae9e3137d08ce88ad456e6c8964cb4529991cf5cac3e0cf0ec71841f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9676783ae9e3137d08ce88ad456e6c8964cb4529991cf5cac3e0cf0ec71841f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:31Z\\\",\\\"message\\\":\\\"anager/kube-controller-manager-crc\\\\nI1124 09:04:31.786010 6247 ovn.go:134] Ensuring 
zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1124 09:04:31.786019 6247 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1124 09:04:31.786026 6247 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-vgbgr\\\\nI1124 09:04:31.785854 6247 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-bsfsd] creating logical port openshift-multus_network-metrics-daemon-bsfsd for pod on switch crc\\\\nI1124 09:04:31.786041 6247 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-vgbgr\\\\nF1124 09:04:31.786044 6247 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed ca\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vgbgr_openshift-ovn-kubernetes(cee9b713-10b0-49a5-841d-fbb083faba9a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf78966056
4d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:43Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.212943 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f8b7c51d44b5fb45c9f7142ebab598e5bbb2aa302fb70cdf67680e9f2d96ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:43Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.226833 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09
:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:43Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.242270 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.242327 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.242341 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.242366 4563 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.242381 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:43Z","lastTransitionTime":"2025-11-24T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.246265 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee
88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources
\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:43Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.264931 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:43Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.345030 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.345088 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.345097 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.345113 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.345124 4563 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:43Z","lastTransitionTime":"2025-11-24T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.447616 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.447699 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.447710 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.447728 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.447739 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:43Z","lastTransitionTime":"2025-11-24T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.550465 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.550583 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.550595 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.550621 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.550658 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:43Z","lastTransitionTime":"2025-11-24T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.652678 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.652720 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.652733 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.652747 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.652756 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:43Z","lastTransitionTime":"2025-11-24T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.755534 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.755593 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.755604 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.755631 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.755662 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:43Z","lastTransitionTime":"2025-11-24T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.760679 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.760726 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.760742 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.760760 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.760772 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:43Z","lastTransitionTime":"2025-11-24T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:43 crc kubenswrapper[4563]: E1124 09:04:43.773386 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:43Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.776744 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.776785 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.776795 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.776809 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.776818 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:43Z","lastTransitionTime":"2025-11-24T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:43 crc kubenswrapper[4563]: E1124 09:04:43.786077 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:43Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.788985 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.789033 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.789045 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.789063 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.789073 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:43Z","lastTransitionTime":"2025-11-24T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:43 crc kubenswrapper[4563]: E1124 09:04:43.799073 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:43Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.802040 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.802076 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.802086 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.802104 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.802116 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:43Z","lastTransitionTime":"2025-11-24T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:43 crc kubenswrapper[4563]: E1124 09:04:43.811891 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:43Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.814678 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.814721 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.814732 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.814744 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.814753 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:43Z","lastTransitionTime":"2025-11-24T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:43 crc kubenswrapper[4563]: E1124 09:04:43.823309 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:43Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:43 crc kubenswrapper[4563]: E1124 09:04:43.823425 4563 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.859666 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.859722 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.859739 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.859757 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.859777 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:43Z","lastTransitionTime":"2025-11-24T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.962221 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.962254 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.962266 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.962278 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:43 crc kubenswrapper[4563]: I1124 09:04:43.962287 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:43Z","lastTransitionTime":"2025-11-24T09:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.054672 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.054709 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.054748 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:44 crc kubenswrapper[4563]: E1124 09:04:44.054873 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:04:44 crc kubenswrapper[4563]: E1124 09:04:44.054926 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:04:44 crc kubenswrapper[4563]: E1124 09:04:44.054986 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.065777 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.065808 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.065821 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.065834 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.065846 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:44Z","lastTransitionTime":"2025-11-24T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.167807 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.167847 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.167858 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.167874 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.167885 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:44Z","lastTransitionTime":"2025-11-24T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.270068 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.270127 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.270139 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.270156 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.270167 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:44Z","lastTransitionTime":"2025-11-24T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.371970 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.372006 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.372015 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.372028 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.372039 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:44Z","lastTransitionTime":"2025-11-24T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.474809 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.474917 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.474931 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.474941 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.474949 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:44Z","lastTransitionTime":"2025-11-24T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.577067 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.577096 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.577104 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.577113 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.577121 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:44Z","lastTransitionTime":"2025-11-24T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.679535 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.679619 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.679652 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.679680 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.679693 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:44Z","lastTransitionTime":"2025-11-24T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.782176 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.782232 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.782244 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.782268 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.782287 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:44Z","lastTransitionTime":"2025-11-24T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.884549 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.884604 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.884618 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.884660 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.884675 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:44Z","lastTransitionTime":"2025-11-24T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.986846 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.986905 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.986921 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.986941 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:44 crc kubenswrapper[4563]: I1124 09:04:44.986953 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:44Z","lastTransitionTime":"2025-11-24T09:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.054215 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:45 crc kubenswrapper[4563]: E1124 09:04:45.054575 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.055137 4563 scope.go:117] "RemoveContainer" containerID="9676783ae9e3137d08ce88ad456e6c8964cb4529991cf5cac3e0cf0ec71841f9" Nov 24 09:04:45 crc kubenswrapper[4563]: E1124 09:04:45.055361 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vgbgr_openshift-ovn-kubernetes(cee9b713-10b0-49a5-841d-fbb083faba9a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.089460 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.089532 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.089549 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.089571 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.089585 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:45Z","lastTransitionTime":"2025-11-24T09:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.191838 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.191882 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.191892 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.191906 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.191915 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:45Z","lastTransitionTime":"2025-11-24T09:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.294214 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.294279 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.294290 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.294311 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.294325 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:45Z","lastTransitionTime":"2025-11-24T09:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.396725 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.396790 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.396803 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.396841 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.396857 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:45Z","lastTransitionTime":"2025-11-24T09:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.498950 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.499002 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.499012 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.499030 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.499041 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:45Z","lastTransitionTime":"2025-11-24T09:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.601426 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.601477 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.601488 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.601507 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.601529 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:45Z","lastTransitionTime":"2025-11-24T09:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.703249 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.703287 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.703297 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.703308 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.703316 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:45Z","lastTransitionTime":"2025-11-24T09:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.806112 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.806174 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.806189 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.806214 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.806226 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:45Z","lastTransitionTime":"2025-11-24T09:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.908235 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.908277 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.908287 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.908302 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:45 crc kubenswrapper[4563]: I1124 09:04:45.908313 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:45Z","lastTransitionTime":"2025-11-24T09:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.010828 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.010906 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.010920 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.010952 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.010965 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:46Z","lastTransitionTime":"2025-11-24T09:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.054345 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.054357 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:46 crc kubenswrapper[4563]: E1124 09:04:46.054449 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.054501 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:46 crc kubenswrapper[4563]: E1124 09:04:46.054722 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:04:46 crc kubenswrapper[4563]: E1124 09:04:46.054804 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.113852 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.113913 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.113931 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.113955 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.113968 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:46Z","lastTransitionTime":"2025-11-24T09:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.216794 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.216854 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.216867 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.216889 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.216902 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:46Z","lastTransitionTime":"2025-11-24T09:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.319843 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.319896 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.319907 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.319923 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.319936 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:46Z","lastTransitionTime":"2025-11-24T09:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.422592 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.422631 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.422660 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.422674 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.422684 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:46Z","lastTransitionTime":"2025-11-24T09:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.524728 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.524777 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.524788 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.524805 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.524816 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:46Z","lastTransitionTime":"2025-11-24T09:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.627116 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.627150 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.627160 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.627175 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.627183 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:46Z","lastTransitionTime":"2025-11-24T09:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.729703 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.729742 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.729754 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.729768 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.729781 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:46Z","lastTransitionTime":"2025-11-24T09:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.831545 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.831584 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.831594 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.831607 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.831616 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:46Z","lastTransitionTime":"2025-11-24T09:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.933949 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.933974 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.933982 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.933993 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:46 crc kubenswrapper[4563]: I1124 09:04:46.934003 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:46Z","lastTransitionTime":"2025-11-24T09:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.035612 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.035884 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.035893 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.035905 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.035914 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:47Z","lastTransitionTime":"2025-11-24T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.054560 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:47 crc kubenswrapper[4563]: E1124 09:04:47.054698 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.138107 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.138143 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.138152 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.138166 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.138178 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:47Z","lastTransitionTime":"2025-11-24T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.239716 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.239756 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.239766 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.239780 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.239790 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:47Z","lastTransitionTime":"2025-11-24T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.342339 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.342394 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.342404 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.342421 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.342433 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:47Z","lastTransitionTime":"2025-11-24T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.444358 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.444411 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.444420 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.444437 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.444449 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:47Z","lastTransitionTime":"2025-11-24T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.547430 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.547469 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.547482 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.547498 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.547508 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:47Z","lastTransitionTime":"2025-11-24T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.650031 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.650118 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.650128 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.650142 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.650152 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:47Z","lastTransitionTime":"2025-11-24T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.752211 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.752256 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.752266 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.752284 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.752297 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:47Z","lastTransitionTime":"2025-11-24T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.854797 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.854828 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.854838 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.854850 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.854860 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:47Z","lastTransitionTime":"2025-11-24T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.957790 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.957820 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.957832 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.957847 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:47 crc kubenswrapper[4563]: I1124 09:04:47.957859 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:47Z","lastTransitionTime":"2025-11-24T09:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.053936 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.054024 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.054114 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:48 crc kubenswrapper[4563]: E1124 09:04:48.054211 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:04:48 crc kubenswrapper[4563]: E1124 09:04:48.054392 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:04:48 crc kubenswrapper[4563]: E1124 09:04:48.054525 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.060812 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.060872 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.060885 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.060897 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.060905 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:48Z","lastTransitionTime":"2025-11-24T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.163184 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.163223 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.163232 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.163249 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.163262 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:48Z","lastTransitionTime":"2025-11-24T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.265480 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.265553 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.265567 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.265582 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.265594 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:48Z","lastTransitionTime":"2025-11-24T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.367720 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.367797 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.367808 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.368046 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.368059 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:48Z","lastTransitionTime":"2025-11-24T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.470837 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.470886 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.470897 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.470918 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.470938 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:48Z","lastTransitionTime":"2025-11-24T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.573793 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.573834 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.573843 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.573860 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.573871 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:48Z","lastTransitionTime":"2025-11-24T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.676722 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.676765 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.676776 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.676793 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.676804 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:48Z","lastTransitionTime":"2025-11-24T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.779618 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.779732 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.779744 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.779760 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.779771 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:48Z","lastTransitionTime":"2025-11-24T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.882130 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.882168 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.882179 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.882196 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.882206 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:48Z","lastTransitionTime":"2025-11-24T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.984704 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.984749 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.984760 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.984779 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:48 crc kubenswrapper[4563]: I1124 09:04:48.984794 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:48Z","lastTransitionTime":"2025-11-24T09:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.054743 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:49 crc kubenswrapper[4563]: E1124 09:04:49.054891 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.087047 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.087092 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.087104 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.087117 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.087131 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:49Z","lastTransitionTime":"2025-11-24T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.189585 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.189634 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.189659 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.189678 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.189688 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:49Z","lastTransitionTime":"2025-11-24T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.291703 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.291751 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.291763 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.291779 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.291790 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:49Z","lastTransitionTime":"2025-11-24T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.394500 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.394550 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.394560 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.394573 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.394587 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:49Z","lastTransitionTime":"2025-11-24T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.497249 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.497303 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.497315 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.497339 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.497353 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:49Z","lastTransitionTime":"2025-11-24T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.600506 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.600580 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.600592 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.600614 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.600625 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:49Z","lastTransitionTime":"2025-11-24T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.703569 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.703615 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.703626 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.703663 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.703674 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:49Z","lastTransitionTime":"2025-11-24T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.806676 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.806727 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.806736 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.806758 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.806769 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:49Z","lastTransitionTime":"2025-11-24T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.909816 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.909867 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.909878 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.909895 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:49 crc kubenswrapper[4563]: I1124 09:04:49.909907 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:49Z","lastTransitionTime":"2025-11-24T09:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.012688 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.012738 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.012749 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.012770 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.012782 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:50Z","lastTransitionTime":"2025-11-24T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.054463 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.054533 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:50 crc kubenswrapper[4563]: E1124 09:04:50.054598 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.054687 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:50 crc kubenswrapper[4563]: E1124 09:04:50.054787 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:04:50 crc kubenswrapper[4563]: E1124 09:04:50.054927 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.114910 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.114952 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.114962 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.114979 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.114989 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:50Z","lastTransitionTime":"2025-11-24T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.216938 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.216993 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.217004 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.217025 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.217035 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:50Z","lastTransitionTime":"2025-11-24T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.319453 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.319499 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.319508 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.319535 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.319544 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:50Z","lastTransitionTime":"2025-11-24T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.421489 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.421562 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.421572 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.421597 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.421611 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:50Z","lastTransitionTime":"2025-11-24T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.525317 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.525410 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.525426 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.525455 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.525468 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:50Z","lastTransitionTime":"2025-11-24T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.627391 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.627433 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.627444 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.627463 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.627474 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:50Z","lastTransitionTime":"2025-11-24T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.730455 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.730522 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.730533 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.730553 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.730570 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:50Z","lastTransitionTime":"2025-11-24T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.833682 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.833741 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.833751 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.833769 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.833784 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:50Z","lastTransitionTime":"2025-11-24T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.935887 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.935913 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.935923 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.935937 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:50 crc kubenswrapper[4563]: I1124 09:04:50.935946 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:50Z","lastTransitionTime":"2025-11-24T09:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.035392 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-metrics-certs\") pod \"network-metrics-daemon-bsfsd\" (UID: \"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\") " pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:51 crc kubenswrapper[4563]: E1124 09:04:51.035545 4563 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 24 09:04:51 crc kubenswrapper[4563]: E1124 09:04:51.035617 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-metrics-certs podName:4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0 nodeName:}" failed. No retries permitted until 2025-11-24 09:05:23.035598007 +0000 UTC m=+100.294575454 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-metrics-certs") pod "network-metrics-daemon-bsfsd" (UID: "4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.037431 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.037482 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.037493 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.037506 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.037531 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:51Z","lastTransitionTime":"2025-11-24T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.054310 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:51 crc kubenswrapper[4563]: E1124 09:04:51.054457 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.143695 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.143757 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.143771 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.143794 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.143809 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:51Z","lastTransitionTime":"2025-11-24T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.246083 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.246124 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.246133 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.246150 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.246163 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:51Z","lastTransitionTime":"2025-11-24T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.348500 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.348546 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.348558 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.348573 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.348585 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:51Z","lastTransitionTime":"2025-11-24T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.450948 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.451063 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.451134 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.451207 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.451288 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:51Z","lastTransitionTime":"2025-11-24T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.553174 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.553211 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.553221 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.553235 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.553247 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:51Z","lastTransitionTime":"2025-11-24T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.656008 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.656054 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.656069 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.656086 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.656095 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:51Z","lastTransitionTime":"2025-11-24T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.757787 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.757822 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.757831 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.757863 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.757874 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:51Z","lastTransitionTime":"2025-11-24T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.859788 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.859837 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.859849 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.859860 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.859869 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:51Z","lastTransitionTime":"2025-11-24T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.961781 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.961830 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.961842 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.961854 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:51 crc kubenswrapper[4563]: I1124 09:04:51.961862 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:51Z","lastTransitionTime":"2025-11-24T09:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.053667 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.053666 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:52 crc kubenswrapper[4563]: E1124 09:04:52.054171 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:04:52 crc kubenswrapper[4563]: E1124 09:04:52.054263 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.053682 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:52 crc kubenswrapper[4563]: E1124 09:04:52.054543 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.063981 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.064078 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.064141 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.064207 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.064271 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:52Z","lastTransitionTime":"2025-11-24T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.166359 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.166396 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.166405 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.166420 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.166431 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:52Z","lastTransitionTime":"2025-11-24T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.268418 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.268678 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.268751 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.268831 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.268903 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:52Z","lastTransitionTime":"2025-11-24T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.370913 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.370952 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.370962 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.370978 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.370988 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:52Z","lastTransitionTime":"2025-11-24T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.473211 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.473243 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.473253 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.473266 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.473275 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:52Z","lastTransitionTime":"2025-11-24T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.574969 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.575010 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.575021 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.575037 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.575047 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:52Z","lastTransitionTime":"2025-11-24T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.676533 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.676571 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.676581 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.676594 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.676605 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:52Z","lastTransitionTime":"2025-11-24T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.778799 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.778836 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.778845 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.778862 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.778872 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:52Z","lastTransitionTime":"2025-11-24T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.880780 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.880816 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.880827 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.880840 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.880853 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:52Z","lastTransitionTime":"2025-11-24T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.983097 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.983144 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.983155 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.983172 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:52 crc kubenswrapper[4563]: I1124 09:04:52.983183 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:52Z","lastTransitionTime":"2025-11-24T09:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.054120 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:53 crc kubenswrapper[4563]: E1124 09:04:53.054269 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.071052 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.080841 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.085863 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.085901 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.085913 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.085926 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.085937 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:53Z","lastTransitionTime":"2025-11-24T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.090838 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.100586 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9c54737-f104-46e9-86b9-0e9ce7915e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f1234c57f8afc7f471626ad72c5c5c9bba242e98b468a7554585ac1e0c1aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0090fc7a8b837592f800c6aadc864f6fd9aa
175561da4bb7cdf19c8e174884d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5ssmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.108781 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bsfsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bsfsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc 
kubenswrapper[4563]: I1124 09:04:53.117246 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.129990 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.138972 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.147412 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.157142 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f1b2e1de3f2bb8bdc6bf1268e8ef7833d05f31b700a7ed752c73b450de3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.167145 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.175507 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a72f551-f3fe-4045-933d-cfaa976bd60f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d4c2140ed99b80bf42fb32fc17368efd71c90e29906dd340bcca1a5bb5fabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b
89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d718684a773c9c1c92ab5722ae89afe44cdd4485b7df17e2d2795ae64645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://265b586b95a5c747bfe4c1a82a536da07d8fded3433c572e7a73ec381565fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-2
4T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c2f690f06390616da3ae5e4c6670793559f05b6bbb4a8441cd23db81b863ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78c2f690f06390616da3ae5e4c6670793559f05b6bbb4a8441cd23db81b863ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.183242 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.188174 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.188221 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.188234 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.188256 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.188269 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:53Z","lastTransitionTime":"2025-11-24T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.190782 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f8b7c51d44b5fb45c9f7142ebab598e5bbb2aa302fb70cdf67680e9f2d96ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.199652 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.209174 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.217589 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.231239 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9676783ae9e3137d08ce88ad456e6c8964cb4529991cf5cac3e0cf0ec71841f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9676783ae9e3137d08ce88ad456e6c8964cb4529991cf5cac3e0cf0ec71841f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:31Z\\\",\\\"message\\\":\\\"anager/kube-controller-manager-crc\\\\nI1124 09:04:31.786010 6247 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1124 09:04:31.786019 6247 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed 
attempt(s)\\\\nI1124 09:04:31.786026 6247 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-vgbgr\\\\nI1124 09:04:31.785854 6247 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-bsfsd] creating logical port openshift-multus_network-metrics-daemon-bsfsd for pod on switch crc\\\\nI1124 09:04:31.786041 6247 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-vgbgr\\\\nF1124 09:04:31.786044 6247 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed ca\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vgbgr_openshift-ovn-kubernetes(cee9b713-10b0-49a5-841d-fbb083faba9a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf78966056
4d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.289807 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.289846 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.289859 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.289873 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.289883 4563 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:53Z","lastTransitionTime":"2025-11-24T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.344591 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nw8xd_019bd805-9123-494a-bb29-f39b924e6243/kube-multus/0.log" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.344674 4563 generic.go:334] "Generic (PLEG): container finished" podID="019bd805-9123-494a-bb29-f39b924e6243" containerID="6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b" exitCode=1 Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.344708 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nw8xd" event={"ID":"019bd805-9123-494a-bb29-f39b924e6243","Type":"ContainerDied","Data":"6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b"} Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.345060 4563 scope.go:117] "RemoveContainer" containerID="6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.364156 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.374765 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:
44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.384076 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.391655 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.391701 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.391714 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.391731 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.391744 4563 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:53Z","lastTransitionTime":"2025-11-24T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.393161 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.402671 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.411735 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:52Z\\\",\\\"message\\\":\\\"2025-11-24T09:04:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_64d4d46d-e0a0-449c-a1b8-1ee3b6c10b3c\\\\n2025-11-24T09:04:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_64d4d46d-e0a0-449c-a1b8-1ee3b6c10b3c to /host/opt/cni/bin/\\\\n2025-11-24T09:04:07Z [verbose] multus-daemon started\\\\n2025-11-24T09:04:07Z [verbose] Readiness Indicator file check\\\\n2025-11-24T09:04:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.419920 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.432336 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f1b2e1de3f2bb8bdc6bf1268e8ef7833d05f31b700a7ed752c73b450de3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"
state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.441736 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9c54737-f104-46e9-86b9-0e9ce7915e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f1234c57f8afc7f471626ad72c5c5c9bba242e98b468a7554585ac1e0c1aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"im
ageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0090fc7a8b837592f800c6aadc864f6fd9aa175561da4bb7cdf19c8e174884d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:17Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5ssmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.452189 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bsfsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bsfsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc 
kubenswrapper[4563]: I1124 09:04:53.462618 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127
527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for 
caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.471857 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a72f551-f3fe-4045-933d-cfaa976bd60f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d4c2140ed99b80bf42fb32fc17368efd71c90e29906dd340bcca1a5bb5fabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d718684a773c9c1c92ab5722ae89afe44cdd4485b7df17e2d2795ae64645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://265b586b95a5c747bfe4c1a82a536da07d8fded3433c572e7a73ec381565fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c2f690f06390616da3ae5e4c6670793559f05b6bbb4a8441cd23db81b863ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://78c2f690f06390616da3ae5e4c6670793559f05b6bbb4a8441cd23db81b863ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.486013 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.494275 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.494310 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.494321 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.494338 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.494350 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:53Z","lastTransitionTime":"2025-11-24T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.496163 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.505546 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.515020 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.529023 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9676783ae9e3137d08ce88ad456e6c8964cb4529991cf5cac3e0cf0ec71841f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9676783ae9e3137d08ce88ad456e6c8964cb4529991cf5cac3e0cf0ec71841f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:31Z\\\",\\\"message\\\":\\\"anager/kube-controller-manager-crc\\\\nI1124 09:04:31.786010 6247 ovn.go:134] Ensuring 
zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1124 09:04:31.786019 6247 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1124 09:04:31.786026 6247 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-vgbgr\\\\nI1124 09:04:31.785854 6247 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-bsfsd] creating logical port openshift-multus_network-metrics-daemon-bsfsd for pod on switch crc\\\\nI1124 09:04:31.786041 6247 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-vgbgr\\\\nF1124 09:04:31.786044 6247 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed ca\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vgbgr_openshift-ovn-kubernetes(cee9b713-10b0-49a5-841d-fbb083faba9a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf78966056
4d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.536971 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f8b7c51d44b5fb45c9f7142ebab598e5bbb2aa302fb70cdf67680e9f2d96ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.596830 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.596866 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.596877 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.596892 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.596900 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:53Z","lastTransitionTime":"2025-11-24T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.699039 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.699077 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.699086 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.699102 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.699113 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:53Z","lastTransitionTime":"2025-11-24T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.801835 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.801863 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.801872 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.801885 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.801892 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:53Z","lastTransitionTime":"2025-11-24T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.903815 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.903843 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.903853 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.903863 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.903872 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:53Z","lastTransitionTime":"2025-11-24T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.919701 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.919758 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.919771 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.919783 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.919792 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:53Z","lastTransitionTime":"2025-11-24T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:53 crc kubenswrapper[4563]: E1124 09:04:53.931234 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.934146 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.934204 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.934217 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.934234 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.934244 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:53Z","lastTransitionTime":"2025-11-24T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:53 crc kubenswrapper[4563]: E1124 09:04:53.943592 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.946385 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.946418 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.946428 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.946440 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.946449 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:53Z","lastTransitionTime":"2025-11-24T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:53 crc kubenswrapper[4563]: E1124 09:04:53.954938 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.957279 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.957306 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.957316 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.957328 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.957335 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:53Z","lastTransitionTime":"2025-11-24T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:53 crc kubenswrapper[4563]: E1124 09:04:53.965350 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.967800 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.967831 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.967841 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.967852 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:53 crc kubenswrapper[4563]: I1124 09:04:53.967859 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:53Z","lastTransitionTime":"2025-11-24T09:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:53 crc kubenswrapper[4563]: E1124 09:04:53.975431 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:53Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:53 crc kubenswrapper[4563]: E1124 09:04:53.975541 4563 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.005595 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.005717 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.005779 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.005850 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.005918 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:54Z","lastTransitionTime":"2025-11-24T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.054162 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.054239 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:54 crc kubenswrapper[4563]: E1124 09:04:54.054262 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.054171 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:54 crc kubenswrapper[4563]: E1124 09:04:54.054369 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:04:54 crc kubenswrapper[4563]: E1124 09:04:54.054395 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.108045 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.108090 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.108103 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.108119 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.108129 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:54Z","lastTransitionTime":"2025-11-24T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.210502 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.210537 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.210545 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.210556 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.210564 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:54Z","lastTransitionTime":"2025-11-24T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.312922 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.312963 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.312973 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.312989 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.313001 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:54Z","lastTransitionTime":"2025-11-24T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.349128 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nw8xd_019bd805-9123-494a-bb29-f39b924e6243/kube-multus/0.log" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.349172 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nw8xd" event={"ID":"019bd805-9123-494a-bb29-f39b924e6243","Type":"ContainerStarted","Data":"381c5f62c655111b7df341bae96a5edef6bcd2d5c3a8758d07465c278445bb8a"} Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.360780 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.368835 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.378125 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381c5f62c655111b7df341bae96a5edef6bcd2d5c3a8758d07465c278445bb8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:52Z\\\",\\\"message\\\":\\\"2025-11-24T09:04:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_64d4d46d-e0a0-449c-a1b8-1ee3b6c10b3c\\\\n2025-11-24T09:04:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_64d4d46d-e0a0-449c-a1b8-1ee3b6c10b3c to /host/opt/cni/bin/\\\\n2025-11-24T09:04:07Z [verbose] multus-daemon started\\\\n2025-11-24T09:04:07Z [verbose] Readiness Indicator file check\\\\n2025-11-24T09:04:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.386448 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b02
3937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.396447 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f1b2e1de3f2bb8bdc6bf1268e8ef7833d05f31b700a7ed752c73b450de3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c
958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:10Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.404601 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9c54737-f104-46e9-86b9-0e9ce7915e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f1234c57f8afc7f471626ad72c5c5c9bba242e98b468a7554585ac1e0c1aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0090fc7a8b837592f800c6aadc864f6fd9aa175561da4bb7cdf19c8e174884d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5ssmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-11-24T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.412540 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bsfsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bsfsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:54 crc 
kubenswrapper[4563]: I1124 09:04:54.414827 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.414858 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.414868 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.414879 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.414887 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:54Z","lastTransitionTime":"2025-11-24T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.423202 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41d
d18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.432080 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a72f551-f3fe-4045-933d-cfaa976bd60f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d4c2140ed99b80bf42fb32fc17368efd71c90e29906dd340bcca1a5bb5fabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d718684a773c9c1c92ab5722ae89afe44cdd4485b7df17e2d2795ae64645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://265b586b95a5c747bfe4c1a82a536da07d8fded3433c572e7a73ec381565fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c2f690f06390616da3ae5e4c6670793559f05b6bbb4a8441cd23db81b863ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://78c2f690f06390616da3ae5e4c6670793559f05b6bbb4a8441cd23db81b863ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.440338 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-24T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.449254 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.458350 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\
\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.473622 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.487553 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9676783ae9e3137d08ce88ad456e6c8964cb4529991cf5cac3e0cf0ec71841f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9676783ae9e3137d08ce88ad456e6c8964cb4529991cf5cac3e0cf0ec71841f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:31Z\\\",\\\"message\\\":\\\"anager/kube-controller-manager-crc\\\\nI1124 09:04:31.786010 6247 ovn.go:134] Ensuring 
zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1124 09:04:31.786019 6247 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1124 09:04:31.786026 6247 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-vgbgr\\\\nI1124 09:04:31.785854 6247 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-bsfsd] creating logical port openshift-multus_network-metrics-daemon-bsfsd for pod on switch crc\\\\nI1124 09:04:31.786041 6247 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-vgbgr\\\\nF1124 09:04:31.786044 6247 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed ca\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vgbgr_openshift-ovn-kubernetes(cee9b713-10b0-49a5-841d-fbb083faba9a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf78966056
4d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.497047 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f8b7c51d44b5fb45c9f7142ebab598e5bbb2aa302fb70cdf67680e9f2d96ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.512317 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09
:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.517716 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.517744 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.517755 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.517770 4563 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.517779 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:54Z","lastTransitionTime":"2025-11-24T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.522496 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee
88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources
\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.532400 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:54Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.620262 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.620305 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.620317 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.620331 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.620347 4563 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:54Z","lastTransitionTime":"2025-11-24T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.723136 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.723168 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.723176 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.723188 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.723199 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:54Z","lastTransitionTime":"2025-11-24T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.825139 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.825184 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.825193 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.825210 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.825220 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:54Z","lastTransitionTime":"2025-11-24T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.927830 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.927872 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.927882 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.927894 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:54 crc kubenswrapper[4563]: I1124 09:04:54.927902 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:54Z","lastTransitionTime":"2025-11-24T09:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.030813 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.030872 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.030884 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.030906 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.030924 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:55Z","lastTransitionTime":"2025-11-24T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.054349 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:55 crc kubenswrapper[4563]: E1124 09:04:55.054508 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.132743 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.132797 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.132810 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.132828 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.132841 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:55Z","lastTransitionTime":"2025-11-24T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.235429 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.235473 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.235502 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.235538 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.235554 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:55Z","lastTransitionTime":"2025-11-24T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.337655 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.337693 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.337701 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.337716 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.337726 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:55Z","lastTransitionTime":"2025-11-24T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.439558 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.439596 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.439605 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.439618 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.439627 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:55Z","lastTransitionTime":"2025-11-24T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.542161 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.542211 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.542222 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.542233 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.542242 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:55Z","lastTransitionTime":"2025-11-24T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.644588 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.644626 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.644651 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.644664 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.644673 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:55Z","lastTransitionTime":"2025-11-24T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.747532 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.747573 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.747582 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.747598 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.747609 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:55Z","lastTransitionTime":"2025-11-24T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.851279 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.851367 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.851382 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.851421 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.851445 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:55Z","lastTransitionTime":"2025-11-24T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.954480 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.954531 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.954542 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.954568 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:55 crc kubenswrapper[4563]: I1124 09:04:55.954576 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:55Z","lastTransitionTime":"2025-11-24T09:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.054319 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.054455 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.054574 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:56 crc kubenswrapper[4563]: E1124 09:04:56.054459 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:04:56 crc kubenswrapper[4563]: E1124 09:04:56.054794 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:04:56 crc kubenswrapper[4563]: E1124 09:04:56.054825 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.057316 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.057346 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.057358 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.057382 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.057411 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:56Z","lastTransitionTime":"2025-11-24T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.159189 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.159226 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.159240 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.159255 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.159268 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:56Z","lastTransitionTime":"2025-11-24T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.261427 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.261466 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.261476 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.261493 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.261503 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:56Z","lastTransitionTime":"2025-11-24T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.362797 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.362823 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.362832 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.362845 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.362856 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:56Z","lastTransitionTime":"2025-11-24T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.465105 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.465131 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.465139 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.465151 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.465160 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:56Z","lastTransitionTime":"2025-11-24T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.567205 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.567238 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.567247 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.567262 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.567272 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:56Z","lastTransitionTime":"2025-11-24T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.669227 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.669264 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.669272 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.669285 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.669295 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:56Z","lastTransitionTime":"2025-11-24T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.771873 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.771949 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.771962 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.771988 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.772000 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:56Z","lastTransitionTime":"2025-11-24T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.874944 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.874976 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.874985 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.874999 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.875009 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:56Z","lastTransitionTime":"2025-11-24T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.977714 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.977752 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.977764 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.977776 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:56 crc kubenswrapper[4563]: I1124 09:04:56.977784 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:56Z","lastTransitionTime":"2025-11-24T09:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.053770 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:57 crc kubenswrapper[4563]: E1124 09:04:57.053913 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.079784 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.079812 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.079821 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.079835 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.079844 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:57Z","lastTransitionTime":"2025-11-24T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.182018 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.182087 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.182099 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.182121 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.182134 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:57Z","lastTransitionTime":"2025-11-24T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.285425 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.285469 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.285479 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.285495 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.285521 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:57Z","lastTransitionTime":"2025-11-24T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.387990 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.388037 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.388049 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.388065 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.388075 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:57Z","lastTransitionTime":"2025-11-24T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.490889 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.490943 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.490957 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.490973 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.490982 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:57Z","lastTransitionTime":"2025-11-24T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.593066 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.593114 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.593123 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.593139 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.593151 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:57Z","lastTransitionTime":"2025-11-24T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.695427 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.695461 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.695470 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.695482 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.695492 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:57Z","lastTransitionTime":"2025-11-24T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.797438 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.797473 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.797482 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.797495 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.797504 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:57Z","lastTransitionTime":"2025-11-24T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.899323 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.899353 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.899363 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.899374 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:57 crc kubenswrapper[4563]: I1124 09:04:57.899382 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:57Z","lastTransitionTime":"2025-11-24T09:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.001488 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.001533 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.001543 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.001554 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.001565 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:58Z","lastTransitionTime":"2025-11-24T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.054443 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.054466 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.054485 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:04:58 crc kubenswrapper[4563]: E1124 09:04:58.054539 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:04:58 crc kubenswrapper[4563]: E1124 09:04:58.054628 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:04:58 crc kubenswrapper[4563]: E1124 09:04:58.054719 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.103278 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.103306 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.103316 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.103327 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.103336 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:58Z","lastTransitionTime":"2025-11-24T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.204998 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.205041 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.205054 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.205068 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.205077 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:58Z","lastTransitionTime":"2025-11-24T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.307202 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.307232 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.307242 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.307254 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.307262 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:58Z","lastTransitionTime":"2025-11-24T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.409576 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.409605 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.409614 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.409627 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.409647 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:58Z","lastTransitionTime":"2025-11-24T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.511764 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.511796 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.511805 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.511817 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.511825 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:58Z","lastTransitionTime":"2025-11-24T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.613850 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.613895 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.613907 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.613922 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.613933 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:58Z","lastTransitionTime":"2025-11-24T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.716447 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.716485 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.716497 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.716512 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.716535 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:58Z","lastTransitionTime":"2025-11-24T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.818664 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.818710 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.818723 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.818740 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.818751 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:58Z","lastTransitionTime":"2025-11-24T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.922171 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.922206 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.922215 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.922229 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:58 crc kubenswrapper[4563]: I1124 09:04:58.922240 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:58Z","lastTransitionTime":"2025-11-24T09:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.024787 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.024831 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.024840 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.024854 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.024863 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:59Z","lastTransitionTime":"2025-11-24T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.054291 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:04:59 crc kubenswrapper[4563]: E1124 09:04:59.054406 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.055117 4563 scope.go:117] "RemoveContainer" containerID="9676783ae9e3137d08ce88ad456e6c8964cb4529991cf5cac3e0cf0ec71841f9" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.127138 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.127166 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.127175 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.127190 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.127200 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:59Z","lastTransitionTime":"2025-11-24T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.229140 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.229180 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.229190 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.229204 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.229214 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:59Z","lastTransitionTime":"2025-11-24T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.331889 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.331933 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.331945 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.331961 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.331971 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:59Z","lastTransitionTime":"2025-11-24T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.363597 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vgbgr_cee9b713-10b0-49a5-841d-fbb083faba9a/ovnkube-controller/2.log" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.366017 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" event={"ID":"cee9b713-10b0-49a5-841d-fbb083faba9a","Type":"ContainerStarted","Data":"d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772"} Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.366441 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.381460 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019b
ee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2
a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.392005 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.400177 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.407189 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bsfsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bsfsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:59 crc 
kubenswrapper[4563]: I1124 09:04:59.415702 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.425478 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.434469 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.434512 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.434533 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.434550 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.434562 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:59Z","lastTransitionTime":"2025-11-24T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.436380 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381c5f62c655111b7df341bae96a5edef6bcd2d5c3a8758d07465c278445bb8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:52Z\\\",\\\"message\\\":\\\"2025-11-24T09:04:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_64d4d46d-e0a0-449c-a1b8-1ee3b6c10b3c\\\\n2025-11-24T09:04:07+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_64d4d46d-e0a0-449c-a1b8-1ee3b6c10b3c to /host/opt/cni/bin/\\\\n2025-11-24T09:04:07Z [verbose] multus-daemon started\\\\n2025-11-24T09:04:07Z [verbose] Readiness Indicator file check\\\\n2025-11-24T09:04:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.444387 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b02
3937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.454240 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f1b2e1de3f2bb8bdc6bf1268e8ef7833d05f31b700a7ed752c73b450de3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c
958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:10Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.462021 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9c54737-f104-46e9-86b9-0e9ce7915e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f1234c57f8afc7f471626ad72c5c5c9bba242e98b468a7554585ac1e0c1aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0090fc7a8b837592f800c6aadc864f6fd9aa175561da4bb7cdf19c8e174884d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5ssmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-11-24T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.471388 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"
name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc
478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41d
d18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.482081 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a72f551-f3fe-4045-933d-cfaa976bd60f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d4c2140ed99b80bf42fb32fc17368efd71c90e29906dd340bcca1a5bb5fabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d718684a773c9c1c92ab5722ae89afe44cdd4485b7df17e2d2795ae64645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://265b586b95a5c747bfe4c1a82a536da07d8fded3433c572e7a73ec381565fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c2f690f06390616da3ae5e4c6670793559f05b6bbb4a8441cd23db81b863ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://78c2f690f06390616da3ae5e4c6670793559f05b6bbb4a8441cd23db81b863ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.491095 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-24T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.501731 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.513730 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\
\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.523739 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.537741 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.537779 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.537792 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.537812 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.537825 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:59Z","lastTransitionTime":"2025-11-24T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.538096 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9676783ae9e3137d08ce88ad456e6c8964cb4529991cf5cac3e0cf0ec71841f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:31Z\\\",\\\"message\\\":\\\"anager/kube-controller-manager-crc\\\\nI1124 09:04:31.786010 6247 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1124 09:04:31.786019 6247 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed 
attempt(s)\\\\nI1124 09:04:31.786026 6247 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-vgbgr\\\\nI1124 09:04:31.785854 6247 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-bsfsd] creating logical port openshift-multus_network-metrics-daemon-bsfsd for pod on switch crc\\\\nI1124 09:04:31.786041 6247 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-vgbgr\\\\nF1124 09:04:31.786044 6247 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed 
ca\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.547394 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f8b7c51d44b5fb45c9f7142ebab598e5bbb2aa302fb70cdf67680e9f2d96ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:04:59Z is after 2025-08-24T17:21:41Z" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.639890 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.639927 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.639939 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.639955 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.639964 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:59Z","lastTransitionTime":"2025-11-24T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.742140 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.742175 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.742183 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.742201 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.742210 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:59Z","lastTransitionTime":"2025-11-24T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.844884 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.844933 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.844943 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.844960 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.844970 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:59Z","lastTransitionTime":"2025-11-24T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.946912 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.946939 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.946964 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.946977 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:04:59 crc kubenswrapper[4563]: I1124 09:04:59.946985 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:04:59Z","lastTransitionTime":"2025-11-24T09:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.049006 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.049041 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.049049 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.049062 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.049071 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:00Z","lastTransitionTime":"2025-11-24T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.054531 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.054531 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.054608 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:00 crc kubenswrapper[4563]: E1124 09:05:00.054720 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:05:00 crc kubenswrapper[4563]: E1124 09:05:00.054819 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:05:00 crc kubenswrapper[4563]: E1124 09:05:00.054866 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.150751 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.150803 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.150813 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.150826 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.150834 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:00Z","lastTransitionTime":"2025-11-24T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.252128 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.252154 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.252164 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.252176 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.252185 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:00Z","lastTransitionTime":"2025-11-24T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.354400 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.354451 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.354463 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.354474 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.354482 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:00Z","lastTransitionTime":"2025-11-24T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.371444 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vgbgr_cee9b713-10b0-49a5-841d-fbb083faba9a/ovnkube-controller/3.log" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.372111 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vgbgr_cee9b713-10b0-49a5-841d-fbb083faba9a/ovnkube-controller/2.log" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.374923 4563 generic.go:334] "Generic (PLEG): container finished" podID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerID="d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772" exitCode=1 Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.374956 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" event={"ID":"cee9b713-10b0-49a5-841d-fbb083faba9a","Type":"ContainerDied","Data":"d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772"} Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.374990 4563 scope.go:117] "RemoveContainer" containerID="9676783ae9e3137d08ce88ad456e6c8964cb4529991cf5cac3e0cf0ec71841f9" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.375392 4563 scope.go:117] "RemoveContainer" containerID="d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772" Nov 24 09:05:00 crc kubenswrapper[4563]: E1124 09:05:00.375526 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vgbgr_openshift-ovn-kubernetes(cee9b713-10b0-49a5-841d-fbb083faba9a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.394751 4563 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc5
1579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d0
9fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:00Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.405929 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c
9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tr
ue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:00Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.420193 4563 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:00Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.429714 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9c54737-f104-46e9-86b9-0e9ce7915e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f1234c57f8afc7f471626ad72c5c5c9bba242e98b468a7554585ac1e0c1aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0090fc7a8b837592f800c6aadc864f6fd9aa
175561da4bb7cdf19c8e174884d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5ssmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:00Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.438265 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bsfsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bsfsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:00Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:00 crc 
kubenswrapper[4563]: I1124 09:05:00.447056 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:00Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.454097 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:00Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.457156 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.457184 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.457193 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.457207 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.457215 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:00Z","lastTransitionTime":"2025-11-24T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.464399 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381c5f62c655111b7df341bae96a5edef6bcd2d5c3a8758d07465c278445bb8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:52Z\\\",\\\"message\\\":\\\"2025-11-24T09:04:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_64d4d46d-e0a0-449c-a1b8-1ee3b6c10b3c\\\\n2025-11-24T09:04:07+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_64d4d46d-e0a0-449c-a1b8-1ee3b6c10b3c to /host/opt/cni/bin/\\\\n2025-11-24T09:04:07Z [verbose] multus-daemon started\\\\n2025-11-24T09:04:07Z [verbose] Readiness Indicator file check\\\\n2025-11-24T09:04:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:00Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.472377 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b02
3937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:00Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.482196 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f1b2e1de3f2bb8bdc6bf1268e8ef7833d05f31b700a7ed752c73b450de3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c
958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:10Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:00Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.491425 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\
"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:00Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.500951 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a72f551-f3fe-4045-933d-cfaa976bd60f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d4c2140ed99b80bf42fb32fc17368efd71c90e29906dd340bcca1a5bb5fabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d718684a773c9c1c92ab5722ae89afe44cdd4485b7df17e2d2795ae64645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://265b586b95a5c747bfe4c1a82a536da07d8fded3433c572e7a73ec381565fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c2f690f06390616da3ae5e4c6670793559f05b6bbb4a8441cd23db81b863ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://78c2f690f06390616da3ae5e4c6670793559f05b6bbb4a8441cd23db81b863ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:00Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.509045 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-24T09:05:00Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.515691 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f8b7c51d44b5fb45c9f7142ebab598e5bbb2aa302fb70cdf67680e9f2d96ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:00Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.524946 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:00Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.533355 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:00Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.541447 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:00Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.553064 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9676783ae9e3137d08ce88ad456e6c8964cb4529991cf5cac3e0cf0ec71841f9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:31Z\\\",\\\"message\\\":\\\"anager/kube-controller-manager-crc\\\\nI1124 09:04:31.786010 6247 ovn.go:134] Ensuring 
zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1124 09:04:31.786019 6247 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1124 09:04:31.786026 6247 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-vgbgr\\\\nI1124 09:04:31.785854 6247 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-bsfsd] creating logical port openshift-multus_network-metrics-daemon-bsfsd for pod on switch crc\\\\nI1124 09:04:31.786041 6247 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-vgbgr\\\\nF1124 09:04:31.786044 6247 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed ca\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:59Z\\\",\\\"message\\\":\\\"IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1124 09:04:59.693301 6607 services_controller.go:444] Built service openshift-oauth-apiserver/api LB per-node configs for network=default: 
[]services.lbConfig(nil)\\\\nI1124 09:04:59.693307 6607 services_controller.go:445] Built service openshift-oauth-apiserver/api LB template configs for network=default: []services.lbConfig(nil)\\\\nI1124 09:04:59.691434 6607 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-stlxr after 0 failed attempt(s)\\\\nI1124 09:04:59.693371 6607 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-stlxr\\\\nF1124 09:04:59.693373 6607 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:00Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.559677 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.559729 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.559743 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.559768 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.559784 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:00Z","lastTransitionTime":"2025-11-24T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.662537 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.662577 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.662587 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.662603 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.662614 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:00Z","lastTransitionTime":"2025-11-24T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.764390 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.764419 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.764429 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.764444 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.764456 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:00Z","lastTransitionTime":"2025-11-24T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.866434 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.866492 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.866501 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.866526 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.866536 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:00Z","lastTransitionTime":"2025-11-24T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.969284 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.969345 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.969358 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.969382 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:00 crc kubenswrapper[4563]: I1124 09:05:00.969395 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:00Z","lastTransitionTime":"2025-11-24T09:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.054353 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:05:01 crc kubenswrapper[4563]: E1124 09:05:01.054589 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.071968 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.072003 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.072012 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.072027 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.072036 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:01Z","lastTransitionTime":"2025-11-24T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.174457 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.174488 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.174496 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.174506 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.174515 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:01Z","lastTransitionTime":"2025-11-24T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.276617 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.276675 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.276686 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.276698 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.276707 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:01Z","lastTransitionTime":"2025-11-24T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.377960 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.378002 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.378010 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.378025 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.378034 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:01Z","lastTransitionTime":"2025-11-24T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.378702 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vgbgr_cee9b713-10b0-49a5-841d-fbb083faba9a/ovnkube-controller/3.log" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.381415 4563 scope.go:117] "RemoveContainer" containerID="d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772" Nov 24 09:05:01 crc kubenswrapper[4563]: E1124 09:05:01.381548 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vgbgr_openshift-ovn-kubernetes(cee9b713-10b0-49a5-841d-fbb083faba9a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.396194 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.404232 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:
44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.411484 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.418589 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.425096 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.433006 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381c5f62c655111b7df341bae96a5edef6bcd2d5c3a8758d07465c278445bb8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:52Z\\\",\\\"message\\\":\\\"2025-11-24T09:04:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_64d4d46d-e0a0-449c-a1b8-1ee3b6c10b3c\\\\n2025-11-24T09:04:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_64d4d46d-e0a0-449c-a1b8-1ee3b6c10b3c to /host/opt/cni/bin/\\\\n2025-11-24T09:04:07Z [verbose] multus-daemon started\\\\n2025-11-24T09:04:07Z [verbose] Readiness Indicator file check\\\\n2025-11-24T09:04:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.439598 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b02
3937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.448774 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f1b2e1de3f2bb8bdc6bf1268e8ef7833d05f31b700a7ed752c73b450de3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c
958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:10Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.455749 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9c54737-f104-46e9-86b9-0e9ce7915e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f1234c57f8afc7f471626ad72c5c5c9bba242e98b468a7554585ac1e0c1aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0090fc7a8b837592f800c6aadc864f6fd9aa175561da4bb7cdf19c8e174884d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5ssmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-11-24T09:05:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.462279 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bsfsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bsfsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:01 crc 
kubenswrapper[4563]: I1124 09:05:01.470389 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127
527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for 
caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.477585 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a72f551-f3fe-4045-933d-cfaa976bd60f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d4c2140ed99b80bf42fb32fc17368efd71c90e29906dd340bcca1a5bb5fabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d718684a773c9c1c92ab5722ae89afe44cdd4485b7df17e2d2795ae64645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://265b586b95a5c747bfe4c1a82a536da07d8fded3433c572e7a73ec381565fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c2f690f06390616da3ae5e4c6670793559f05b6bbb4a8441cd23db81b863ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://78c2f690f06390616da3ae5e4c6670793559f05b6bbb4a8441cd23db81b863ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.479935 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.480032 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.480103 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.480167 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.480237 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:01Z","lastTransitionTime":"2025-11-24T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.484702 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.492478 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.500445 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:01Z is after 2025-08-24T17:21:41Z" 
Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.509120 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.525249 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:59Z\\\",\\\"message\\\":\\\"IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, 
externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1124 09:04:59.693301 6607 services_controller.go:444] Built service openshift-oauth-apiserver/api LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1124 09:04:59.693307 6607 services_controller.go:445] Built service openshift-oauth-apiserver/api LB template configs for network=default: []services.lbConfig(nil)\\\\nI1124 09:04:59.691434 6607 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-stlxr after 0 failed attempt(s)\\\\nI1124 09:04:59.693371 6607 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-stlxr\\\\nF1124 09:04:59.693373 6607 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vgbgr_openshift-ovn-kubernetes(cee9b713-10b0-49a5-841d-fbb083faba9a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf78966056
4d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.532089 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f8b7c51d44b5fb45c9f7142ebab598e5bbb2aa302fb70cdf67680e9f2d96ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:01Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.583677 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.583720 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.583730 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.583757 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.583766 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:01Z","lastTransitionTime":"2025-11-24T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.685726 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.685758 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.685768 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.685782 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.685791 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:01Z","lastTransitionTime":"2025-11-24T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.788573 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.788608 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.788616 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.788653 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.788665 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:01Z","lastTransitionTime":"2025-11-24T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.891326 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.891355 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.891365 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.891380 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.891391 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:01Z","lastTransitionTime":"2025-11-24T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.993751 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.993788 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.993800 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.993816 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:01 crc kubenswrapper[4563]: I1124 09:05:01.993825 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:01Z","lastTransitionTime":"2025-11-24T09:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.054386 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:02 crc kubenswrapper[4563]: E1124 09:05:02.054483 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.054555 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:02 crc kubenswrapper[4563]: E1124 09:05:02.054785 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.054951 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:02 crc kubenswrapper[4563]: E1124 09:05:02.055114 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.096534 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.096575 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.096590 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.096610 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.096623 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:02Z","lastTransitionTime":"2025-11-24T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.199122 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.199166 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.199194 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.199209 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.199218 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:02Z","lastTransitionTime":"2025-11-24T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.301886 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.301929 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.301941 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.301955 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.301964 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:02Z","lastTransitionTime":"2025-11-24T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.403895 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.403960 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.403972 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.403988 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.403999 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:02Z","lastTransitionTime":"2025-11-24T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.506921 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.506976 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.506989 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.507011 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.507052 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:02Z","lastTransitionTime":"2025-11-24T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.610000 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.610103 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.610114 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.610149 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.610167 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:02Z","lastTransitionTime":"2025-11-24T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.711940 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.711976 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.711985 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.712000 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.712008 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:02Z","lastTransitionTime":"2025-11-24T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.814065 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.814109 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.814118 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.814131 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.814140 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:02Z","lastTransitionTime":"2025-11-24T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.916477 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.916512 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.916536 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.916548 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:02 crc kubenswrapper[4563]: I1124 09:05:02.916556 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:02Z","lastTransitionTime":"2025-11-24T09:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.018966 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.019075 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.019088 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.019112 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.019126 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:03Z","lastTransitionTime":"2025-11-24T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.054546 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:05:03 crc kubenswrapper[4563]: E1124 09:05:03.054699 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.067698 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41d
d18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.076199 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a72f551-f3fe-4045-933d-cfaa976bd60f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d4c2140ed99b80bf42fb32fc17368efd71c90e29906dd340bcca1a5bb5fabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d718684a773c9c1c92ab5722ae89afe44cdd4485b7df17e2d2795ae64645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://265b586b95a5c747bfe4c1a82a536da07d8fded3433c572e7a73ec381565fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c2f690f06390616da3ae5e4c6670793559f05b6bbb4a8441cd23db81b863ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://78c2f690f06390616da3ae5e4c6670793559f05b6bbb4a8441cd23db81b863ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.084289 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-24T09:05:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.091452 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f8b7c51d44b5fb45c9f7142ebab598e5bbb2aa302fb70cdf67680e9f2d96ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.101304 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.112129 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.120919 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.120956 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.120966 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.121006 4563 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.121017 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:03Z","lastTransitionTime":"2025-11-24T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.123591 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.136562 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:59Z\\\",\\\"message\\\":\\\"IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, 
externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1124 09:04:59.693301 6607 services_controller.go:444] Built service openshift-oauth-apiserver/api LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1124 09:04:59.693307 6607 services_controller.go:445] Built service openshift-oauth-apiserver/api LB template configs for network=default: []services.lbConfig(nil)\\\\nI1124 09:04:59.691434 6607 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-stlxr after 0 failed attempt(s)\\\\nI1124 09:04:59.693371 6607 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-stlxr\\\\nF1124 09:04:59.693373 6607 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vgbgr_openshift-ovn-kubernetes(cee9b713-10b0-49a5-841d-fbb083faba9a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf78966056
4d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.151806 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.160981 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:
44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.169889 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.178234 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9c54737-f104-46e9-86b9-0e9ce7915e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f1234c57f8afc7f471626ad72c5c5c9bba242e98b468a7554585ac1e0c1aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0090fc7a8b837592f800c6aadc864f6fd9aa
175561da4bb7cdf19c8e174884d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5ssmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.185871 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bsfsd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bsfsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:03 crc 
kubenswrapper[4563]: I1124 09:05:03.194664 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.207098 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.215157 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381c5f62c655111b7df341bae96a5edef6bcd2d5c3a8758d07465c278445bb8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:52Z\\\",\\\"message\\\":\\\"2025-11-24T09:04:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_64d4d46d-e0a0-449c-a1b8-1ee3b6c10b3c\\\\n2025-11-24T09:04:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_64d4d46d-e0a0-449c-a1b8-1ee3b6c10b3c to /host/opt/cni/bin/\\\\n2025-11-24T09:04:07Z [verbose] multus-daemon started\\\\n2025-11-24T09:04:07Z [verbose] Readiness Indicator file check\\\\n2025-11-24T09:04:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.222289 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b02
3937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.222982 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.223026 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.223038 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:03 crc 
kubenswrapper[4563]: I1124 09:05:03.223053 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.223064 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:03Z","lastTransitionTime":"2025-11-24T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.231674 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f1b2e1de3f2bb8bdc6bf1268e8ef7833d05f31b700a7ed752c73b450de3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc
d3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:03Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.325555 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.325584 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.325611 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.325628 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.325657 4563 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:03Z","lastTransitionTime":"2025-11-24T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.428322 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.428366 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.428377 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.428394 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.428406 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:03Z","lastTransitionTime":"2025-11-24T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.532070 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.532103 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.532114 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.532125 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.532133 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:03Z","lastTransitionTime":"2025-11-24T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.633870 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.633918 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.633929 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.633944 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.633953 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:03Z","lastTransitionTime":"2025-11-24T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.735695 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.735731 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.735740 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.735752 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.735763 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:03Z","lastTransitionTime":"2025-11-24T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.753976 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:05:03 crc kubenswrapper[4563]: E1124 09:05:03.754118 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-24 09:06:07.754104763 +0000 UTC m=+145.013082210 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.837959 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.838002 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.838014 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.838028 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.838038 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:03Z","lastTransitionTime":"2025-11-24T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.855251 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.855292 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.855313 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.855331 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:03 crc kubenswrapper[4563]: E1124 09:05:03.855383 4563 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Nov 24 09:05:03 crc kubenswrapper[4563]: E1124 09:05:03.855438 4563 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 09:05:03 crc kubenswrapper[4563]: E1124 09:05:03.855453 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 09:06:07.855434095 +0000 UTC m=+145.114411552 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 24 09:05:03 crc kubenswrapper[4563]: E1124 09:05:03.855471 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-24 09:06:07.855462909 +0000 UTC m=+145.114440356 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 24 09:05:03 crc kubenswrapper[4563]: E1124 09:05:03.855554 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 09:05:03 crc kubenswrapper[4563]: E1124 09:05:03.855558 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 24 09:05:03 crc kubenswrapper[4563]: E1124 09:05:03.855566 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 09:05:03 crc kubenswrapper[4563]: E1124 09:05:03.855687 4563 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:05:03 crc kubenswrapper[4563]: E1124 09:05:03.855722 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-24 09:06:07.855710826 +0000 UTC m=+145.114688274 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:05:03 crc kubenswrapper[4563]: E1124 09:05:03.856321 4563 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 24 09:05:03 crc kubenswrapper[4563]: E1124 09:05:03.856344 4563 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:05:03 crc kubenswrapper[4563]: E1124 09:05:03.856393 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-24 09:06:07.856379287 +0000 UTC m=+145.115356734 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.939755 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.939813 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.939828 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.939845 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:03 crc kubenswrapper[4563]: I1124 09:05:03.940099 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:03Z","lastTransitionTime":"2025-11-24T09:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.042580 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.042627 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.042652 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.042668 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.042681 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:04Z","lastTransitionTime":"2025-11-24T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.053788 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.053818 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.053858 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:04 crc kubenswrapper[4563]: E1124 09:05:04.053956 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:05:04 crc kubenswrapper[4563]: E1124 09:05:04.054101 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:05:04 crc kubenswrapper[4563]: E1124 09:05:04.054188 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.110478 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.110552 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.110567 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.110602 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.110613 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:04Z","lastTransitionTime":"2025-11-24T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:04 crc kubenswrapper[4563]: E1124 09:05:04.122223 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:04Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.125634 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.125676 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.125685 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.125698 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.125707 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:04Z","lastTransitionTime":"2025-11-24T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:04 crc kubenswrapper[4563]: E1124 09:05:04.134389 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:04Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.137012 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.137040 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.137049 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.137059 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.137066 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:04Z","lastTransitionTime":"2025-11-24T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:04 crc kubenswrapper[4563]: E1124 09:05:04.146024 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:04Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.149208 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.149236 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.149245 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.149254 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.149261 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:04Z","lastTransitionTime":"2025-11-24T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:04 crc kubenswrapper[4563]: E1124 09:05:04.158731 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:04Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.161852 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.161890 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.161898 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.161907 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.161915 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:04Z","lastTransitionTime":"2025-11-24T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:04 crc kubenswrapper[4563]: E1124 09:05:04.170986 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:04Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:04 crc kubenswrapper[4563]: E1124 09:05:04.171086 4563 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.172186 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.172237 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.172247 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.172261 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.172269 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:04Z","lastTransitionTime":"2025-11-24T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.274220 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.274288 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.274299 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.274315 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.274327 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:04Z","lastTransitionTime":"2025-11-24T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.376267 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.376320 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.376329 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.376339 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.376346 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:04Z","lastTransitionTime":"2025-11-24T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.477993 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.478021 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.478030 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.478041 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.478050 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:04Z","lastTransitionTime":"2025-11-24T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.580397 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.580436 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.580462 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.580474 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.580481 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:04Z","lastTransitionTime":"2025-11-24T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.682729 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.682774 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.682783 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.682793 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.682801 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:04Z","lastTransitionTime":"2025-11-24T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.784908 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.784967 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.784977 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.784993 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.785003 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:04Z","lastTransitionTime":"2025-11-24T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.886842 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.886862 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.886871 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.886881 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.886887 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:04Z","lastTransitionTime":"2025-11-24T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.989471 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.989516 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.989536 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.989556 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:04 crc kubenswrapper[4563]: I1124 09:05:04.989569 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:04Z","lastTransitionTime":"2025-11-24T09:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.054199 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:05:05 crc kubenswrapper[4563]: E1124 09:05:05.054351 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.091616 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.091676 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.091686 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.091696 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.091705 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:05Z","lastTransitionTime":"2025-11-24T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.194015 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.194073 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.194083 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.194103 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.194116 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:05Z","lastTransitionTime":"2025-11-24T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.296100 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.296140 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.296151 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.296166 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.296176 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:05Z","lastTransitionTime":"2025-11-24T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.397899 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.397942 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.397967 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.397983 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.397993 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:05Z","lastTransitionTime":"2025-11-24T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.499894 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.499931 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.499941 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.499960 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.499968 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:05Z","lastTransitionTime":"2025-11-24T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.602126 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.602192 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.602203 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.602235 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.602246 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:05Z","lastTransitionTime":"2025-11-24T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.704503 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.704546 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.704558 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.704572 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.704583 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:05Z","lastTransitionTime":"2025-11-24T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.806858 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.806911 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.806921 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.806935 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.806945 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:05Z","lastTransitionTime":"2025-11-24T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.909048 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.909323 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.909396 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.909463 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:05 crc kubenswrapper[4563]: I1124 09:05:05.909534 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:05Z","lastTransitionTime":"2025-11-24T09:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.011749 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.012003 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.012074 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.012135 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.012187 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:06Z","lastTransitionTime":"2025-11-24T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.053953 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:06 crc kubenswrapper[4563]: E1124 09:05:06.054165 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.053962 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:06 crc kubenswrapper[4563]: E1124 09:05:06.054387 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.053952 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:06 crc kubenswrapper[4563]: E1124 09:05:06.054589 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.114875 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.115073 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.115148 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.115215 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.115277 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:06Z","lastTransitionTime":"2025-11-24T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.216994 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.217039 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.217051 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.217064 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.217073 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:06Z","lastTransitionTime":"2025-11-24T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.318843 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.318873 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.318881 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.318894 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.318903 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:06Z","lastTransitionTime":"2025-11-24T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.420986 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.421032 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.421041 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.421054 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.421063 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:06Z","lastTransitionTime":"2025-11-24T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.522815 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.522848 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.522857 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.522871 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.522879 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:06Z","lastTransitionTime":"2025-11-24T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.625228 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.625258 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.625266 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.625278 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.625285 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:06Z","lastTransitionTime":"2025-11-24T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.727518 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.727580 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.727588 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.727612 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.727621 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:06Z","lastTransitionTime":"2025-11-24T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.829142 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.829325 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.829334 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.829347 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.829359 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:06Z","lastTransitionTime":"2025-11-24T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.931038 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.931066 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.931074 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.931084 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:06 crc kubenswrapper[4563]: I1124 09:05:06.931092 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:06Z","lastTransitionTime":"2025-11-24T09:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.032745 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.032784 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.032794 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.032807 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.032816 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:07Z","lastTransitionTime":"2025-11-24T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.054162 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:05:07 crc kubenswrapper[4563]: E1124 09:05:07.054286 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.134830 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.134874 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.134884 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.134902 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.134915 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:07Z","lastTransitionTime":"2025-11-24T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.238170 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.238224 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.238238 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.238254 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.238265 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:07Z","lastTransitionTime":"2025-11-24T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.340224 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.340277 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.340288 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.340301 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.340310 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:07Z","lastTransitionTime":"2025-11-24T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.442847 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.442880 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.442887 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.442899 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.442907 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:07Z","lastTransitionTime":"2025-11-24T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.545091 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.545122 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.545130 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.545141 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.545149 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:07Z","lastTransitionTime":"2025-11-24T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.647394 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.647424 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.647433 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.647447 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.647456 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:07Z","lastTransitionTime":"2025-11-24T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.749592 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.749627 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.749652 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.749665 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.749675 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:07Z","lastTransitionTime":"2025-11-24T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.851177 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.851215 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.851223 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.851235 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.851244 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:07Z","lastTransitionTime":"2025-11-24T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.952970 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.953008 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.953016 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.953030 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:07 crc kubenswrapper[4563]: I1124 09:05:07.953040 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:07Z","lastTransitionTime":"2025-11-24T09:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.054217 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.054267 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.054282 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:08 crc kubenswrapper[4563]: E1124 09:05:08.054340 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:05:08 crc kubenswrapper[4563]: E1124 09:05:08.054466 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:05:08 crc kubenswrapper[4563]: E1124 09:05:08.054553 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.055490 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.055518 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.055552 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.055565 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.055573 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:08Z","lastTransitionTime":"2025-11-24T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.157495 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.157549 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.157558 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.157572 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.157582 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:08Z","lastTransitionTime":"2025-11-24T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.259733 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.259764 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.259773 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.259784 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.259792 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:08Z","lastTransitionTime":"2025-11-24T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.361489 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.361535 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.361545 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.361558 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.361567 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:08Z","lastTransitionTime":"2025-11-24T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.463474 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.463514 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.463537 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.463551 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.463560 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:08Z","lastTransitionTime":"2025-11-24T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.565777 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.565811 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.565819 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.565831 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.565839 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:08Z","lastTransitionTime":"2025-11-24T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.668932 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.668968 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.668978 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.668991 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.669000 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:08Z","lastTransitionTime":"2025-11-24T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.771165 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.771196 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.771205 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.771215 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.771222 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:08Z","lastTransitionTime":"2025-11-24T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.872824 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.872864 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.872872 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.872883 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.872893 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:08Z","lastTransitionTime":"2025-11-24T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.974861 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.974891 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.974899 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.974909 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:08 crc kubenswrapper[4563]: I1124 09:05:08.974917 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:08Z","lastTransitionTime":"2025-11-24T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.053779 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:05:09 crc kubenswrapper[4563]: E1124 09:05:09.053875 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.077195 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.077222 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.077230 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.077242 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.077250 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:09Z","lastTransitionTime":"2025-11-24T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.179522 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.179552 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.179560 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.179574 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.179583 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:09Z","lastTransitionTime":"2025-11-24T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.281800 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.281837 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.281847 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.281860 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.281871 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:09Z","lastTransitionTime":"2025-11-24T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.383935 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.383964 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.383974 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.383994 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.384007 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:09Z","lastTransitionTime":"2025-11-24T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.485951 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.486183 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.486209 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.486223 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.486236 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:09Z","lastTransitionTime":"2025-11-24T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.587787 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.587812 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.587822 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.587832 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.587840 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:09Z","lastTransitionTime":"2025-11-24T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.689934 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.689961 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.689970 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.689980 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.689988 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:09Z","lastTransitionTime":"2025-11-24T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.791748 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.791911 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.791928 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.791939 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.791946 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:09Z","lastTransitionTime":"2025-11-24T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.893925 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.893972 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.893983 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.893997 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.894006 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:09Z","lastTransitionTime":"2025-11-24T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.996228 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.996264 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.996274 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.996286 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:09 crc kubenswrapper[4563]: I1124 09:05:09.996295 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:09Z","lastTransitionTime":"2025-11-24T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.053828 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:10 crc kubenswrapper[4563]: E1124 09:05:10.053901 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.053949 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.053987 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:10 crc kubenswrapper[4563]: E1124 09:05:10.054125 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:05:10 crc kubenswrapper[4563]: E1124 09:05:10.054232 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.098042 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.098067 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.098076 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.098085 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.098093 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:10Z","lastTransitionTime":"2025-11-24T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.200352 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.200381 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.200390 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.200405 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.200414 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:10Z","lastTransitionTime":"2025-11-24T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.306413 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.306496 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.306509 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.306549 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.306563 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:10Z","lastTransitionTime":"2025-11-24T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.408908 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.409329 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.409404 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.409475 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.409560 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:10Z","lastTransitionTime":"2025-11-24T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.511500 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.511550 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.511559 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.511573 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.511585 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:10Z","lastTransitionTime":"2025-11-24T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.614068 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.614130 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.614143 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.614161 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.614173 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:10Z","lastTransitionTime":"2025-11-24T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.716651 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.716701 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.716711 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.716731 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.716740 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:10Z","lastTransitionTime":"2025-11-24T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.820149 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.820203 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.820214 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.820232 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.820242 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:10Z","lastTransitionTime":"2025-11-24T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.923666 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.923720 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.923731 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.923755 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:10 crc kubenswrapper[4563]: I1124 09:05:10.923764 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:10Z","lastTransitionTime":"2025-11-24T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.026864 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.026920 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.026934 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.026957 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.026968 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:11Z","lastTransitionTime":"2025-11-24T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.053888 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:05:11 crc kubenswrapper[4563]: E1124 09:05:11.054052 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.129977 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.130031 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.130042 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.130059 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.130072 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:11Z","lastTransitionTime":"2025-11-24T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.232961 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.233004 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.233015 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.233032 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.233041 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:11Z","lastTransitionTime":"2025-11-24T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.335695 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.335733 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.335745 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.335761 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.335771 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:11Z","lastTransitionTime":"2025-11-24T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.437607 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.437656 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.437677 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.437690 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.437699 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:11Z","lastTransitionTime":"2025-11-24T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.539620 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.539675 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.539685 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.539697 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.539707 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:11Z","lastTransitionTime":"2025-11-24T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.642319 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.642378 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.642389 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.642411 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.642424 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:11Z","lastTransitionTime":"2025-11-24T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.744179 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.744210 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.744218 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.744249 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.744261 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:11Z","lastTransitionTime":"2025-11-24T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.846326 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.846374 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.846383 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.846399 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.846409 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:11Z","lastTransitionTime":"2025-11-24T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.948691 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.948720 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.948730 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.948742 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:11 crc kubenswrapper[4563]: I1124 09:05:11.948750 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:11Z","lastTransitionTime":"2025-11-24T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.050977 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.051008 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.051019 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.051049 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.051057 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:12Z","lastTransitionTime":"2025-11-24T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.054359 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:12 crc kubenswrapper[4563]: E1124 09:05:12.054500 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.054513 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.054535 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:12 crc kubenswrapper[4563]: E1124 09:05:12.054904 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:05:12 crc kubenswrapper[4563]: E1124 09:05:12.055414 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.061837 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.153506 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.153555 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.153566 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.153577 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.153587 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:12Z","lastTransitionTime":"2025-11-24T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.255581 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.255635 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.255660 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.255678 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.255689 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:12Z","lastTransitionTime":"2025-11-24T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.358106 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.358138 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.358147 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.358156 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.358164 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:12Z","lastTransitionTime":"2025-11-24T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.460405 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.460427 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.460438 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.460450 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.460458 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:12Z","lastTransitionTime":"2025-11-24T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.561965 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.561999 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.562011 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.562026 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.562035 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:12Z","lastTransitionTime":"2025-11-24T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.663559 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.663600 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.663611 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.663626 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.663652 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:12Z","lastTransitionTime":"2025-11-24T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.766289 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.766324 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.766333 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.766351 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.766358 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:12Z","lastTransitionTime":"2025-11-24T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.868126 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.868171 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.868180 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.868192 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.868202 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:12Z","lastTransitionTime":"2025-11-24T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.969720 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.969751 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.969759 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.969769 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:12 crc kubenswrapper[4563]: I1124 09:05:12.969778 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:12Z","lastTransitionTime":"2025-11-24T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.054062 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:05:13 crc kubenswrapper[4563]: E1124 09:05:13.054173 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.066834 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.071033 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.071066 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.071076 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.071090 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.071101 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:13Z","lastTransitionTime":"2025-11-24T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.076988 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7jjh2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"223c299f-bbf0-4b77-9792-045c08cbfb0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6137627b7eb156dc8a423610b2cd4bcc7c4f37de296b039109e9bfe9282cea22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6clq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7jjh2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.085851 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nw8xd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019bd805-9123-494a-bb29-f39b924e6243\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381c5f62c655111b7df341bae
96a5edef6bcd2d5c3a8758d07465c278445bb8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:52Z\\\",\\\"message\\\":\\\"2025-11-24T09:04:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_64d4d46d-e0a0-449c-a1b8-1ee3b6c10b3c\\\\n2025-11-24T09:04:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_64d4d46d-e0a0-449c-a1b8-1ee3b6c10b3c to /host/opt/cni/bin/\\\\n2025-11-24T09:04:07Z [verbose] multus-daemon started\\\\n2025-11-24T09:04:07Z [verbose] Readiness Indicator file check\\\\n2025-11-24T09:04:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f8nv5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nw8xd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.092863 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b2bfe55-8989-49b3-bb61-e28189447627\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc8036d6ff7c488d1680e95892f24ecec7071467fec139b8cc24a39fbb4379b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm8wf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stlxr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.101987 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7qphz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf08273-4b03-4e6f-8e52-d968b8c98f99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711f1b2e1de3f2bb8bdc6bf1268e8ef7833d05f31b700a7ed752c73b450de3b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2025-11-24T09:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://460c1e3448c0b715d4c1450cbce5e9ed54fb73c3174c2f33aa41bcc8b80e2175\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abe7797124000f00ed83ec5f43bf861f9cf2dc459c0bfe9864d18e22c846a7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fce3bff3372661f5adbdbf710995c696aedd7ae609150cfb7f606fc2816e82e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d937c958b05efb97530b46e331a5772a1cd1a43c4bd1789807d0e8896f06751a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcb7e0bb1ffe5c5d8f49de2a173bf65ee6054706f12cee2f8690e3a1fc914e89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcd3dd662a9fb99d4cf54fa5f88bd1301451d628dd6c0533e0b2a6c61240a0b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-np6f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7qphz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.109096 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b9c54737-f104-46e9-86b9-0e9ce7915e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f1234c57f8afc7f4716
26ad72c5c5c9bba242e98b468a7554585ac1e0c1aa7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0090fc7a8b837592f800c6aadc864f6fd9aa175561da4bb7cdf19c8e174884d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zbzzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\
\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5ssmn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.116504 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bsfsd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4dmd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bsfsd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:13 crc 
kubenswrapper[4563]: I1124 09:05:13.128473 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d1b848-9915-4f4a-a147-f4bf6d63472c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a659a09916ec91aba128925d190a842ddcb0f02fb222f63cf4a839b7a888b55b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b061df97e5b127
527b35093b7f6b793c3f0016a023677b77a7d23b9916c2b354\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3794069a872ce995057cc10851283e3700931a3dbbe96c33ef9b8fc0d9ae2c57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16ca3745d62fdc7e542bbc8bd2dad4f9e5a8314522ea7eaa29a65697923ff4b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://6dac1a1bd1ac9ec48519e1adb8f6d6c9bb4514b0199e89cb072109ed1579e2ed\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-24T09:03:59Z\\\",\\\"message\\\":\\\":45 +0000 UTC (now=2025-11-24 09:03:59.801002918 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801128 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1763975039\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1763975039\\\\\\\\\\\\\\\" (2025-11-24 08:03:59 +0000 UTC to 2026-11-24 08:03:59 +0000 UTC (now=2025-11-24 09:03:59.801113377 +0000 UTC))\\\\\\\"\\\\nI1124 09:03:59.801146 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1124 09:03:59.801163 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1124 09:03:59.801211 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801222 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801233 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1124 09:03:59.801239 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1124 09:03:59.801237 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2484665369/tls.crt::/tmp/serving-cert-2484665369/tls.key\\\\\\\"\\\\nI1124 09:03:59.801273 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1124 09:03:59.801287 1 shared_informer.go:313] Waiting for 
caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1124 09:03:59.801308 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1124 09:03:59.801315 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nF1124 09:03:59.801390 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf4607df4d4c4f1e475cfddb61dc85484b6350b3ebdf4ed8b88b8a4d89849893\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c499fa48a06f0178931aa39f083615bd17e350c2535e35646b41dd18dea42a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.139406 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a72f551-f3fe-4045-933d-cfaa976bd60f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76d4c2140ed99b80bf42fb32fc17368efd71c90e29906dd340bcca1a5bb5fabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d718684a773c9c1c92ab5722ae89afe44cdd4485b7df17e2d2795ae64645df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://265b586b95a5c747bfe4c1a82a536da07d8fded3433c572e7a73ec381565fb6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c2f690f06390616da3ae5e4c6670793559f05b6bbb4a8441cd23db81b863ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://78c2f690f06390616da3ae5e4c6670793559f05b6bbb4a8441cd23db81b863ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.148035 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://baaf6ee29eaf56d7b7cde733ef7452391cc80252d2e67b697661dfa5aa3a5f49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-24T09:05:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.154835 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"667b3c47-2078-4d73-960f-21925bf52282\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de9057e94769992c75e6445a5c816164c64ed64f9dd2b07e8b317a7e17654f77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c71f1a45af95e970bf803b18f56400db645e493ba0aca4162bf68c018731155d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c71f1a45af95e970bf803b18f56400db645e493ba0aca4162bf68c018731155d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.163075 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e951e931944675245a00283a28efcbed7efdd20c72fca673972f5f8f2f8c956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.171067 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6bc344ceb17bdffa46ef5991e986f292e15dcfcfa9c0f2d4a098d16a3228bb60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://be19918b679a217c3e4ab0eecb079bc9ecba57d7f9005251b588f7051881a293\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.172462 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.172498 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.172510 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.172526 4563 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.172546 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:13Z","lastTransitionTime":"2025-11-24T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.179919 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.192572 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cee9b713-10b0-49a5-841d-fbb083faba9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-24T09:04:59Z\\\",\\\"message\\\":\\\"IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, 
externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1124 09:04:59.693301 6607 services_controller.go:444] Built service openshift-oauth-apiserver/api LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1124 09:04:59.693307 6607 services_controller.go:445] Built service openshift-oauth-apiserver/api LB template configs for network=default: []services.lbConfig(nil)\\\\nI1124 09:04:59.691434 6607 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-stlxr after 0 failed attempt(s)\\\\nI1124 09:04:59.693371 6607 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-stlxr\\\\nF1124 09:04:59.693373 6607 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vgbgr_openshift-ovn-kubernetes(cee9b713-10b0-49a5-841d-fbb083faba9a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f67738fdf78966056
4d3b4757ec91493532fa710efb2547de4d020eacee46c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5d62m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vgbgr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.199566 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l4cg2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8d6b29f6-3a4d-408e-b6fc-9f8ded8787aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f8b7c51d44b5fb45c9f7142ebab598e5bbb2aa302fb70cdf67680e9f2d96ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ffx6c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:04:09Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l4cg2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.219905 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0a5b3d9-135a-4bd3-acd5-6ab17c46427e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6066408143a7d61b6822e013bb7cd364fef09796ca62c72dd3677e46ab15b941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25416a31f5b408d120056f682adaa50188e2e505d1f462fff5dc51579fc03169\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357aa577578d35b32d4f52f00096841a9eb807c175dd7d924cf2b062e322c8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09
:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1ae0e2a1db4c1edb16ffdc190ff1034983eb79387711858ad0636eebe052ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d7985e2805a33de87e7554358419fd1185b278adc9fca93b4c6b1765456b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d24e519b464a1a81016c2a76913d09fc24fe1536716e2dc3fef19bc5c39daa59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4abace48a4134ae610a97734c8d130e43eff3bb19a546ab05d7b05661f9f3b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a668e51b814f46d87a2b103be88b9589d578cef5ced56c56a500a18035b652\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-24T09:03:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.238238 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c871ce83-da0c-480e-81de-278a3b05acb0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-24T09:03:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://209af2c7112b5dee09d1cf64dcbac0a78f617e294b10784ee9c337503e1cd2e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b79724f9e9680f951fb7bbf08e4b8c32f95f88a1eed7b0231e32a110748c6ec9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3994262ce6d983458a66ab217bb9fb36f9752d9b6c406578649924305fd3ce8e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c0ecce00a5801d234c3dd6b29e3aed4622cf2972afa55e4b5d6cf8676144d22\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-24T09:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-24T09:03:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.253667 4563 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-24T09:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:13Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.275221 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.275258 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.275268 4563 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.275283 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.275294 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:13Z","lastTransitionTime":"2025-11-24T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.377083 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.377146 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.377158 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.377184 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.377203 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:13Z","lastTransitionTime":"2025-11-24T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.479508 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.479592 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.479615 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.479665 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.479682 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:13Z","lastTransitionTime":"2025-11-24T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.581692 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.581757 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.581768 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.581781 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.581811 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:13Z","lastTransitionTime":"2025-11-24T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.684345 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.684388 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.684400 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.684419 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.684430 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:13Z","lastTransitionTime":"2025-11-24T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.786908 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.786951 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.786960 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.786977 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.786988 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:13Z","lastTransitionTime":"2025-11-24T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.888955 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.889006 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.889016 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.889032 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.889042 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:13Z","lastTransitionTime":"2025-11-24T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.990738 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.990783 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.990793 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.990807 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:13 crc kubenswrapper[4563]: I1124 09:05:13.990818 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:13Z","lastTransitionTime":"2025-11-24T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.053742 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.053776 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:14 crc kubenswrapper[4563]: E1124 09:05:14.053849 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:05:14 crc kubenswrapper[4563]: E1124 09:05:14.053961 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.054001 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:14 crc kubenswrapper[4563]: E1124 09:05:14.054082 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.093393 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.093450 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.093460 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.093473 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.093483 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:14Z","lastTransitionTime":"2025-11-24T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.195676 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.195722 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.195732 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.195746 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.195757 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:14Z","lastTransitionTime":"2025-11-24T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.297874 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.297911 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.297920 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.297935 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.297945 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:14Z","lastTransitionTime":"2025-11-24T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.399749 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.399805 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.399813 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.399827 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.399835 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:14Z","lastTransitionTime":"2025-11-24T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.473216 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.473248 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.473260 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.473274 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.473284 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:14Z","lastTransitionTime":"2025-11-24T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:14 crc kubenswrapper[4563]: E1124 09:05:14.482773 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:14Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.485484 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.485525 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.485549 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.485566 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.485575 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:14Z","lastTransitionTime":"2025-11-24T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:14 crc kubenswrapper[4563]: E1124 09:05:14.494189 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:14Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.496655 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.496693 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.496702 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.496715 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.496725 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:14Z","lastTransitionTime":"2025-11-24T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:14 crc kubenswrapper[4563]: E1124 09:05:14.505081 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:14Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.507499 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.507526 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.507546 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.507574 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.507582 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:14Z","lastTransitionTime":"2025-11-24T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:14 crc kubenswrapper[4563]: E1124 09:05:14.515881 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:14Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.518617 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.518685 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.518694 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.518709 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.518719 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:14Z","lastTransitionTime":"2025-11-24T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:14 crc kubenswrapper[4563]: E1124 09:05:14.527115 4563 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-24T09:05:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f656a478-9d9a-4ffb-98be-bf6c1dcaa83e\\\",\\\"systemUUID\\\":\\\"3210c6ca-f708-448c-9ff2-b003edce1c8c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-24T09:05:14Z is after 2025-08-24T17:21:41Z" Nov 24 09:05:14 crc kubenswrapper[4563]: E1124 09:05:14.527217 4563 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.528248 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.528275 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.528288 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.528302 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.528313 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:14Z","lastTransitionTime":"2025-11-24T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.630591 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.630669 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.630678 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.630692 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.630703 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:14Z","lastTransitionTime":"2025-11-24T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.732823 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.732858 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.732867 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.732882 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.732890 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:14Z","lastTransitionTime":"2025-11-24T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.835219 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.835260 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.835285 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.835299 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.835309 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:14Z","lastTransitionTime":"2025-11-24T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.936982 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.937010 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.937018 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.937030 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:14 crc kubenswrapper[4563]: I1124 09:05:14.937038 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:14Z","lastTransitionTime":"2025-11-24T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.039822 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.039858 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.039867 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.039880 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.039888 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:15Z","lastTransitionTime":"2025-11-24T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.054116 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:05:15 crc kubenswrapper[4563]: E1124 09:05:15.054218 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.142201 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.142232 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.142262 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.142277 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.142285 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:15Z","lastTransitionTime":"2025-11-24T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.244578 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.244612 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.244620 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.244634 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.244661 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:15Z","lastTransitionTime":"2025-11-24T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.346992 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.347028 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.347038 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.347055 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.347065 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:15Z","lastTransitionTime":"2025-11-24T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.449049 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.449088 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.449097 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.449110 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.449118 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:15Z","lastTransitionTime":"2025-11-24T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.551470 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.551514 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.551523 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.551554 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.551563 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:15Z","lastTransitionTime":"2025-11-24T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.653459 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.653505 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.653514 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.653548 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.653559 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:15Z","lastTransitionTime":"2025-11-24T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.755571 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.755607 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.755616 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.755629 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.755658 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:15Z","lastTransitionTime":"2025-11-24T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.858799 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.858846 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.858856 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.858871 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.858879 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:15Z","lastTransitionTime":"2025-11-24T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.961205 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.961249 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.961259 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.961274 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:15 crc kubenswrapper[4563]: I1124 09:05:15.961284 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:15Z","lastTransitionTime":"2025-11-24T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.054102 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.054460 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.054478 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:16 crc kubenswrapper[4563]: E1124 09:05:16.054586 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:05:16 crc kubenswrapper[4563]: E1124 09:05:16.054685 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.054807 4563 scope.go:117] "RemoveContainer" containerID="d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772" Nov 24 09:05:16 crc kubenswrapper[4563]: E1124 09:05:16.054977 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:05:16 crc kubenswrapper[4563]: E1124 09:05:16.055005 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vgbgr_openshift-ovn-kubernetes(cee9b713-10b0-49a5-841d-fbb083faba9a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.063352 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.063382 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.063392 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.063405 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.063414 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:16Z","lastTransitionTime":"2025-11-24T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.165408 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.165441 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.165449 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.165461 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.165469 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:16Z","lastTransitionTime":"2025-11-24T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.267665 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.267693 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.267702 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.267714 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.267722 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:16Z","lastTransitionTime":"2025-11-24T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.370140 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.370179 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.370191 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.370204 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.370211 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:16Z","lastTransitionTime":"2025-11-24T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.471982 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.472227 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.472314 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.472393 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.472451 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:16Z","lastTransitionTime":"2025-11-24T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.574080 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.574120 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.574128 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.574141 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.574150 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:16Z","lastTransitionTime":"2025-11-24T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.676361 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.676397 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.676407 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.676420 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.676428 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:16Z","lastTransitionTime":"2025-11-24T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.778232 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.778268 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.778277 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.778289 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.778298 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:16Z","lastTransitionTime":"2025-11-24T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.880940 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.881160 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.881171 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.881187 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.881195 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:16Z","lastTransitionTime":"2025-11-24T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.982865 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.982902 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.982911 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.982926 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:16 crc kubenswrapper[4563]: I1124 09:05:16.982935 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:16Z","lastTransitionTime":"2025-11-24T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.054625 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:05:17 crc kubenswrapper[4563]: E1124 09:05:17.055033 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.085152 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.085179 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.085188 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.085198 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.085206 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:17Z","lastTransitionTime":"2025-11-24T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.187544 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.187680 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.187761 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.187822 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.187884 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:17Z","lastTransitionTime":"2025-11-24T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.289753 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.289789 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.289798 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.289813 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.289822 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:17Z","lastTransitionTime":"2025-11-24T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.392053 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.392188 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.392268 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.392329 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.392403 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:17Z","lastTransitionTime":"2025-11-24T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.494838 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.494877 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.494886 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.494898 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.494906 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:17Z","lastTransitionTime":"2025-11-24T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.597277 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.597328 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.597338 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.597351 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.597360 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:17Z","lastTransitionTime":"2025-11-24T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.698792 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.698818 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.698827 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.698837 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.698844 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:17Z","lastTransitionTime":"2025-11-24T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.801076 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.801104 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.801112 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.801123 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.801132 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:17Z","lastTransitionTime":"2025-11-24T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.902783 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.902950 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.903037 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.903106 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:17 crc kubenswrapper[4563]: I1124 09:05:17.903171 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:17Z","lastTransitionTime":"2025-11-24T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.005035 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.005063 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.005074 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.005087 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.005095 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:18Z","lastTransitionTime":"2025-11-24T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.053568 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.053609 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:18 crc kubenswrapper[4563]: E1124 09:05:18.053680 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:05:18 crc kubenswrapper[4563]: E1124 09:05:18.053752 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.053761 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:18 crc kubenswrapper[4563]: E1124 09:05:18.053847 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.107212 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.107240 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.107250 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.107261 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.107271 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:18Z","lastTransitionTime":"2025-11-24T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.209165 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.209205 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.209215 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.209230 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.209242 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:18Z","lastTransitionTime":"2025-11-24T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.311104 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.311135 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.311161 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.311172 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.311181 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:18Z","lastTransitionTime":"2025-11-24T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.412883 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.412916 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.412924 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.412935 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.412944 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:18Z","lastTransitionTime":"2025-11-24T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.514784 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.514808 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.514818 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.514828 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.514842 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:18Z","lastTransitionTime":"2025-11-24T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.616589 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.616607 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.616614 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.616623 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.616630 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:18Z","lastTransitionTime":"2025-11-24T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.718738 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.718838 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.718908 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.718974 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.719023 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:18Z","lastTransitionTime":"2025-11-24T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.821356 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.821446 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.821517 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.821592 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.821665 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:18Z","lastTransitionTime":"2025-11-24T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.923221 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.923263 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.923272 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.923287 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:18 crc kubenswrapper[4563]: I1124 09:05:18.923304 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:18Z","lastTransitionTime":"2025-11-24T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.025453 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.025499 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.025511 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.025527 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.025548 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:19Z","lastTransitionTime":"2025-11-24T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.054074 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:05:19 crc kubenswrapper[4563]: E1124 09:05:19.054208 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.127276 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.127303 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.127311 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.127320 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.127328 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:19Z","lastTransitionTime":"2025-11-24T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.229006 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.229039 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.229048 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.229058 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.229068 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:19Z","lastTransitionTime":"2025-11-24T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.331115 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.331151 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.331159 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.331319 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.331328 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:19Z","lastTransitionTime":"2025-11-24T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.432706 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.432728 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.432737 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.432745 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.432752 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:19Z","lastTransitionTime":"2025-11-24T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.534240 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.534276 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.534286 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.534300 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.534311 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:19Z","lastTransitionTime":"2025-11-24T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.636575 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.636621 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.636633 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.636679 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.636689 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:19Z","lastTransitionTime":"2025-11-24T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.738550 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.738577 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.738607 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.738619 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.738627 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:19Z","lastTransitionTime":"2025-11-24T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.840782 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.840844 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.840854 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.840878 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.840894 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:19Z","lastTransitionTime":"2025-11-24T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.942729 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.942774 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.942785 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.942799 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:19 crc kubenswrapper[4563]: I1124 09:05:19.942810 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:19Z","lastTransitionTime":"2025-11-24T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.044965 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.044999 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.045011 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.045026 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.045039 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:20Z","lastTransitionTime":"2025-11-24T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.054403 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.054461 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.054473 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:20 crc kubenswrapper[4563]: E1124 09:05:20.054515 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:05:20 crc kubenswrapper[4563]: E1124 09:05:20.054601 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:05:20 crc kubenswrapper[4563]: E1124 09:05:20.054747 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.146970 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.146999 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.147008 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.147020 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.147028 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:20Z","lastTransitionTime":"2025-11-24T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.249292 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.249331 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.249341 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.249359 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.249374 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:20Z","lastTransitionTime":"2025-11-24T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.352077 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.352123 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.352133 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.352149 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.352163 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:20Z","lastTransitionTime":"2025-11-24T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.454303 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.454350 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.454360 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.454378 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.454390 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:20Z","lastTransitionTime":"2025-11-24T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.556175 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.556213 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.556225 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.556241 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.556251 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:20Z","lastTransitionTime":"2025-11-24T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.659258 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.659311 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.659322 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.659340 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.659352 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:20Z","lastTransitionTime":"2025-11-24T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.761448 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.761504 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.761520 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.761560 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.761578 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:20Z","lastTransitionTime":"2025-11-24T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.864118 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.864153 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.864161 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.864175 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.864184 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:20Z","lastTransitionTime":"2025-11-24T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.966509 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.966557 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.966566 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.966583 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:20 crc kubenswrapper[4563]: I1124 09:05:20.966593 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:20Z","lastTransitionTime":"2025-11-24T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.053992 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:05:21 crc kubenswrapper[4563]: E1124 09:05:21.054122 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.068137 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.068278 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.068381 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.068470 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.068599 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:21Z","lastTransitionTime":"2025-11-24T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.171369 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.171545 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.171611 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.171693 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.171776 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:21Z","lastTransitionTime":"2025-11-24T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.274422 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.274458 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.274467 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.274480 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.274491 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:21Z","lastTransitionTime":"2025-11-24T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.376738 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.376775 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.376785 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.376799 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.376810 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:21Z","lastTransitionTime":"2025-11-24T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.479175 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.479213 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.479221 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.479234 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.479244 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:21Z","lastTransitionTime":"2025-11-24T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.582180 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.582223 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.582232 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.582245 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.582254 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:21Z","lastTransitionTime":"2025-11-24T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.684220 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.684274 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.684283 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.684298 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.684309 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:21Z","lastTransitionTime":"2025-11-24T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.785915 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.785952 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.785962 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.785979 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.785990 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:21Z","lastTransitionTime":"2025-11-24T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.887699 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.887744 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.887754 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.887772 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.887787 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:21Z","lastTransitionTime":"2025-11-24T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.989188 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.989223 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.989234 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.989250 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:21 crc kubenswrapper[4563]: I1124 09:05:21.989262 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:21Z","lastTransitionTime":"2025-11-24T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.054203 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.054239 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.054382 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:22 crc kubenswrapper[4563]: E1124 09:05:22.054526 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:05:22 crc kubenswrapper[4563]: E1124 09:05:22.054595 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:05:22 crc kubenswrapper[4563]: E1124 09:05:22.054684 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.091030 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.091057 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.091065 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.091075 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.091085 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:22Z","lastTransitionTime":"2025-11-24T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.193469 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.193499 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.193508 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.193519 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.193530 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:22Z","lastTransitionTime":"2025-11-24T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.295154 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.295187 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.295196 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.295209 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.295217 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:22Z","lastTransitionTime":"2025-11-24T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.397552 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.397583 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.397591 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.397600 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.397608 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:22Z","lastTransitionTime":"2025-11-24T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.499390 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.499417 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.499426 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.499437 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.499447 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:22Z","lastTransitionTime":"2025-11-24T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.601577 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.601612 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.601621 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.601635 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.601663 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:22Z","lastTransitionTime":"2025-11-24T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.703219 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.703255 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.703263 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.703278 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.703287 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:22Z","lastTransitionTime":"2025-11-24T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.805362 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.805391 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.805400 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.805410 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.805418 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:22Z","lastTransitionTime":"2025-11-24T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.907829 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.907866 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.907875 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.907888 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:05:22 crc kubenswrapper[4563]: I1124 09:05:22.907897 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:22Z","lastTransitionTime":"2025-11-24T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.010025 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.010065 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.010074 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.010086 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.010095 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:23Z","lastTransitionTime":"2025-11-24T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.054689 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd"
Nov 24 09:05:23 crc kubenswrapper[4563]: E1124 09:05:23.054797 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.067109 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=83.067100336 podStartE2EDuration="1m23.067100336s" podCreationTimestamp="2025-11-24 09:04:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:05:23.066896912 +0000 UTC m=+100.325874359" watchObservedRunningTime="2025-11-24 09:05:23.067100336 +0000 UTC m=+100.326077783"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.082612 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=53.082598022 podStartE2EDuration="53.082598022s" podCreationTimestamp="2025-11-24 09:04:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:05:23.075006356 +0000 UTC m=+100.333983803" watchObservedRunningTime="2025-11-24 09:05:23.082598022 +0000 UTC m=+100.341575469"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.099594 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-l4cg2" podStartSLOduration=78.099581751 podStartE2EDuration="1m18.099581751s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:05:23.092742643 +0000 UTC m=+100.351720090" watchObservedRunningTime="2025-11-24 09:05:23.099581751 +0000 UTC m=+100.358559198"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.111696 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=11.111679756 podStartE2EDuration="11.111679756s" podCreationTimestamp="2025-11-24 09:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:05:23.099633199 +0000 UTC m=+100.358610645" watchObservedRunningTime="2025-11-24 09:05:23.111679756 +0000 UTC m=+100.370657202"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.112562 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.112591 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.112599 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.112610 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.112619 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:23Z","lastTransitionTime":"2025-11-24T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.129110 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-metrics-certs\") pod \"network-metrics-daemon-bsfsd\" (UID: \"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\") " pod="openshift-multus/network-metrics-daemon-bsfsd"
Nov 24 09:05:23 crc kubenswrapper[4563]: E1124 09:05:23.129207 4563 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Nov 24 09:05:23 crc kubenswrapper[4563]: E1124 09:05:23.129260 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-metrics-certs podName:4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0 nodeName:}" failed. No retries permitted until 2025-11-24 09:06:27.129245992 +0000 UTC m=+164.388223440 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-metrics-certs") pod "network-metrics-daemon-bsfsd" (UID: "4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.167486 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=82.167465459 podStartE2EDuration="1m22.167465459s" podCreationTimestamp="2025-11-24 09:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:05:23.166370564 +0000 UTC m=+100.425348011" watchObservedRunningTime="2025-11-24 09:05:23.167465459 +0000 UTC m=+100.426442906"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.175784 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=82.175768206 podStartE2EDuration="1m22.175768206s" podCreationTimestamp="2025-11-24 09:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:05:23.17570689 +0000 UTC m=+100.434684338" watchObservedRunningTime="2025-11-24 09:05:23.175768206 +0000 UTC m=+100.434745654"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.197834 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5ssmn" podStartSLOduration=78.197814181 podStartE2EDuration="1m18.197814181s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:05:23.190971649 +0000 UTC m=+100.449949095" watchObservedRunningTime="2025-11-24 09:05:23.197814181 +0000 UTC m=+100.456791629"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.214225 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.214235 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7jjh2" podStartSLOduration=78.21421955 podStartE2EDuration="1m18.21421955s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:05:23.213590674 +0000 UTC m=+100.472568121" watchObservedRunningTime="2025-11-24 09:05:23.21421955 +0000 UTC m=+100.473196997"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.214263 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.214360 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.214375 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.214384 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:23Z","lastTransitionTime":"2025-11-24T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.231964 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-nw8xd" podStartSLOduration=78.231948193 podStartE2EDuration="1m18.231948193s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:05:23.223665172 +0000 UTC m=+100.482642619" watchObservedRunningTime="2025-11-24 09:05:23.231948193 +0000 UTC m=+100.490925641"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.232555 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podStartSLOduration=78.23255137 podStartE2EDuration="1m18.23255137s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:05:23.231775358 +0000 UTC m=+100.490752804" watchObservedRunningTime="2025-11-24 09:05:23.23255137 +0000 UTC m=+100.491528818"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.316459 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.316500 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.316510 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.316524 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.316535 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:23Z","lastTransitionTime":"2025-11-24T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.417921 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.417949 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.417956 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.417967 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.417976 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:23Z","lastTransitionTime":"2025-11-24T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.520054 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.520091 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.520100 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.520111 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.520121 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:23Z","lastTransitionTime":"2025-11-24T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.622597 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.622662 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.622673 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.622686 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.622695 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:23Z","lastTransitionTime":"2025-11-24T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.724847 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.725137 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.725203 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.725263 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.725322 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:23Z","lastTransitionTime":"2025-11-24T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.827880 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.827988 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.828052 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.828112 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.828167 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:23Z","lastTransitionTime":"2025-11-24T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.930064 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.930170 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.930243 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.930306 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:05:23 crc kubenswrapper[4563]: I1124 09:05:23.930369 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:23Z","lastTransitionTime":"2025-11-24T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.032703 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.032752 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.032762 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.032775 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.032784 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:24Z","lastTransitionTime":"2025-11-24T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.054131 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.054162 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 24 09:05:24 crc kubenswrapper[4563]: E1124 09:05:24.054231 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 24 09:05:24 crc kubenswrapper[4563]: E1124 09:05:24.054320 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.054131 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 24 09:05:24 crc kubenswrapper[4563]: E1124 09:05:24.054405 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.135210 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.135260 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.135272 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.135287 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.135302 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:24Z","lastTransitionTime":"2025-11-24T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.237112 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.237151 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.237159 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.237173 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.237183 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:24Z","lastTransitionTime":"2025-11-24T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.338598 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.338652 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.338662 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.338675 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.338683 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:24Z","lastTransitionTime":"2025-11-24T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.441802 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.441838 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.441862 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.441877 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.441887 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:24Z","lastTransitionTime":"2025-11-24T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.541730 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.541767 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.541776 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.541790 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.541808 4563 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-24T09:05:24Z","lastTransitionTime":"2025-11-24T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.568207 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-7qphz" podStartSLOduration=79.568189577 podStartE2EDuration="1m19.568189577s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:05:23.242840444 +0000 UTC m=+100.501817892" watchObservedRunningTime="2025-11-24 09:05:24.568189577 +0000 UTC m=+101.827167024"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.568707 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-7b26k"]
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.569149 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7b26k"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.570315 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.570570 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.570656 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.572163 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.741148 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/08ed6e4c-2b2b-4e46-b409-70e1042b100e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-7b26k\" (UID: \"08ed6e4c-2b2b-4e46-b409-70e1042b100e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7b26k"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.741210 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/08ed6e4c-2b2b-4e46-b409-70e1042b100e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-7b26k\" (UID: \"08ed6e4c-2b2b-4e46-b409-70e1042b100e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7b26k"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.741234 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08ed6e4c-2b2b-4e46-b409-70e1042b100e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7b26k\" (UID: \"08ed6e4c-2b2b-4e46-b409-70e1042b100e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7b26k"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.741267 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/08ed6e4c-2b2b-4e46-b409-70e1042b100e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-7b26k\" (UID: \"08ed6e4c-2b2b-4e46-b409-70e1042b100e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7b26k"
Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.741286 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08ed6e4c-2b2b-4e46-b409-70e1042b100e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7b26k\" (UID: \"08ed6e4c-2b2b-4e46-b409-70e1042b100e\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7b26k" Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.842074 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/08ed6e4c-2b2b-4e46-b409-70e1042b100e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-7b26k\" (UID: \"08ed6e4c-2b2b-4e46-b409-70e1042b100e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7b26k" Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.842104 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08ed6e4c-2b2b-4e46-b409-70e1042b100e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7b26k\" (UID: \"08ed6e4c-2b2b-4e46-b409-70e1042b100e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7b26k" Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.842123 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/08ed6e4c-2b2b-4e46-b409-70e1042b100e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-7b26k\" (UID: \"08ed6e4c-2b2b-4e46-b409-70e1042b100e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7b26k" Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.842149 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08ed6e4c-2b2b-4e46-b409-70e1042b100e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7b26k\" (UID: \"08ed6e4c-2b2b-4e46-b409-70e1042b100e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7b26k" Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.842175 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/08ed6e4c-2b2b-4e46-b409-70e1042b100e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-7b26k\" (UID: \"08ed6e4c-2b2b-4e46-b409-70e1042b100e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7b26k" Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.842213 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/08ed6e4c-2b2b-4e46-b409-70e1042b100e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-7b26k\" (UID: \"08ed6e4c-2b2b-4e46-b409-70e1042b100e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7b26k" Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.842190 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/08ed6e4c-2b2b-4e46-b409-70e1042b100e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-7b26k\" (UID: \"08ed6e4c-2b2b-4e46-b409-70e1042b100e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7b26k" Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.842983 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/08ed6e4c-2b2b-4e46-b409-70e1042b100e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-7b26k\" (UID: \"08ed6e4c-2b2b-4e46-b409-70e1042b100e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7b26k" Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.847045 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08ed6e4c-2b2b-4e46-b409-70e1042b100e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7b26k\" (UID: \"08ed6e4c-2b2b-4e46-b409-70e1042b100e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7b26k" Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 
09:05:24.855362 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08ed6e4c-2b2b-4e46-b409-70e1042b100e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7b26k\" (UID: \"08ed6e4c-2b2b-4e46-b409-70e1042b100e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7b26k" Nov 24 09:05:24 crc kubenswrapper[4563]: I1124 09:05:24.879778 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7b26k" Nov 24 09:05:25 crc kubenswrapper[4563]: I1124 09:05:25.053908 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:05:25 crc kubenswrapper[4563]: E1124 09:05:25.054177 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:05:25 crc kubenswrapper[4563]: I1124 09:05:25.441827 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7b26k" event={"ID":"08ed6e4c-2b2b-4e46-b409-70e1042b100e","Type":"ContainerStarted","Data":"b1405d3f5e5c4af084d85b23ba391a018cd2b10ef10e85ce17a0ebba37581b0b"} Nov 24 09:05:25 crc kubenswrapper[4563]: I1124 09:05:25.441879 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7b26k" event={"ID":"08ed6e4c-2b2b-4e46-b409-70e1042b100e","Type":"ContainerStarted","Data":"e8091e5882f381c3d99740911cb9664d5b29a5e70f1171a68b5e4489238ba96a"} Nov 24 09:05:25 crc kubenswrapper[4563]: I1124 09:05:25.452597 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7b26k" podStartSLOduration=80.452582484 podStartE2EDuration="1m20.452582484s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:05:25.451942717 +0000 UTC m=+102.710920165" watchObservedRunningTime="2025-11-24 09:05:25.452582484 +0000 UTC m=+102.711559931" Nov 24 09:05:26 crc kubenswrapper[4563]: I1124 09:05:26.053804 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:26 crc kubenswrapper[4563]: E1124 09:05:26.054085 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:05:26 crc kubenswrapper[4563]: I1124 09:05:26.053854 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:26 crc kubenswrapper[4563]: I1124 09:05:26.053842 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:26 crc kubenswrapper[4563]: E1124 09:05:26.054154 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:05:26 crc kubenswrapper[4563]: E1124 09:05:26.054253 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:05:27 crc kubenswrapper[4563]: I1124 09:05:27.053996 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:05:27 crc kubenswrapper[4563]: E1124 09:05:27.054103 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:05:28 crc kubenswrapper[4563]: I1124 09:05:28.053885 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:28 crc kubenswrapper[4563]: I1124 09:05:28.053980 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:28 crc kubenswrapper[4563]: E1124 09:05:28.054004 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:05:28 crc kubenswrapper[4563]: E1124 09:05:28.054103 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:05:28 crc kubenswrapper[4563]: I1124 09:05:28.053985 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:28 crc kubenswrapper[4563]: E1124 09:05:28.054183 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:05:29 crc kubenswrapper[4563]: I1124 09:05:29.054415 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:05:29 crc kubenswrapper[4563]: E1124 09:05:29.054555 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:05:30 crc kubenswrapper[4563]: I1124 09:05:30.053774 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:30 crc kubenswrapper[4563]: I1124 09:05:30.053777 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:30 crc kubenswrapper[4563]: I1124 09:05:30.053859 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:30 crc kubenswrapper[4563]: E1124 09:05:30.053964 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:05:30 crc kubenswrapper[4563]: E1124 09:05:30.054021 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:05:30 crc kubenswrapper[4563]: E1124 09:05:30.054056 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:05:30 crc kubenswrapper[4563]: I1124 09:05:30.054532 4563 scope.go:117] "RemoveContainer" containerID="d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772" Nov 24 09:05:30 crc kubenswrapper[4563]: E1124 09:05:30.054686 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vgbgr_openshift-ovn-kubernetes(cee9b713-10b0-49a5-841d-fbb083faba9a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" Nov 24 09:05:31 crc kubenswrapper[4563]: I1124 09:05:31.054165 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:05:31 crc kubenswrapper[4563]: E1124 09:05:31.054265 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:05:32 crc kubenswrapper[4563]: I1124 09:05:32.054401 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:32 crc kubenswrapper[4563]: I1124 09:05:32.054437 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:32 crc kubenswrapper[4563]: E1124 09:05:32.054500 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:05:32 crc kubenswrapper[4563]: I1124 09:05:32.054398 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:32 crc kubenswrapper[4563]: E1124 09:05:32.054710 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:05:32 crc kubenswrapper[4563]: E1124 09:05:32.054898 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:05:33 crc kubenswrapper[4563]: I1124 09:05:33.054455 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:05:33 crc kubenswrapper[4563]: E1124 09:05:33.055244 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:05:34 crc kubenswrapper[4563]: I1124 09:05:34.054489 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:34 crc kubenswrapper[4563]: I1124 09:05:34.054556 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:34 crc kubenswrapper[4563]: E1124 09:05:34.054596 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:05:34 crc kubenswrapper[4563]: E1124 09:05:34.054660 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:05:34 crc kubenswrapper[4563]: I1124 09:05:34.054495 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:34 crc kubenswrapper[4563]: E1124 09:05:34.054734 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:05:35 crc kubenswrapper[4563]: I1124 09:05:35.054399 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:05:35 crc kubenswrapper[4563]: E1124 09:05:35.054512 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:05:36 crc kubenswrapper[4563]: I1124 09:05:36.054276 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:36 crc kubenswrapper[4563]: I1124 09:05:36.054310 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:36 crc kubenswrapper[4563]: I1124 09:05:36.054346 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:36 crc kubenswrapper[4563]: E1124 09:05:36.054394 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:05:36 crc kubenswrapper[4563]: E1124 09:05:36.054470 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:05:36 crc kubenswrapper[4563]: E1124 09:05:36.054561 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:05:37 crc kubenswrapper[4563]: I1124 09:05:37.054659 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:05:37 crc kubenswrapper[4563]: E1124 09:05:37.054759 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:05:38 crc kubenswrapper[4563]: I1124 09:05:38.054183 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:38 crc kubenswrapper[4563]: I1124 09:05:38.054222 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:38 crc kubenswrapper[4563]: E1124 09:05:38.054271 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:05:38 crc kubenswrapper[4563]: E1124 09:05:38.054351 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:05:38 crc kubenswrapper[4563]: I1124 09:05:38.054198 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:38 crc kubenswrapper[4563]: E1124 09:05:38.054591 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:05:39 crc kubenswrapper[4563]: I1124 09:05:39.053728 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:05:39 crc kubenswrapper[4563]: E1124 09:05:39.053843 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:05:39 crc kubenswrapper[4563]: I1124 09:05:39.473227 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nw8xd_019bd805-9123-494a-bb29-f39b924e6243/kube-multus/1.log" Nov 24 09:05:39 crc kubenswrapper[4563]: I1124 09:05:39.473590 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nw8xd_019bd805-9123-494a-bb29-f39b924e6243/kube-multus/0.log" Nov 24 09:05:39 crc kubenswrapper[4563]: I1124 09:05:39.473632 4563 generic.go:334] "Generic (PLEG): container finished" podID="019bd805-9123-494a-bb29-f39b924e6243" containerID="381c5f62c655111b7df341bae96a5edef6bcd2d5c3a8758d07465c278445bb8a" exitCode=1 Nov 24 09:05:39 crc kubenswrapper[4563]: I1124 09:05:39.473666 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nw8xd" event={"ID":"019bd805-9123-494a-bb29-f39b924e6243","Type":"ContainerDied","Data":"381c5f62c655111b7df341bae96a5edef6bcd2d5c3a8758d07465c278445bb8a"} Nov 24 09:05:39 crc kubenswrapper[4563]: I1124 09:05:39.473714 4563 scope.go:117] "RemoveContainer" containerID="6e39aaa733f68965e909b68dae6958cc77d1a5eeca18377e91fac771c2ee959b" Nov 24 09:05:39 crc kubenswrapper[4563]: I1124 09:05:39.474263 4563 scope.go:117] "RemoveContainer" containerID="381c5f62c655111b7df341bae96a5edef6bcd2d5c3a8758d07465c278445bb8a" Nov 24 09:05:39 crc kubenswrapper[4563]: E1124 09:05:39.474509 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-nw8xd_openshift-multus(019bd805-9123-494a-bb29-f39b924e6243)\"" pod="openshift-multus/multus-nw8xd" podUID="019bd805-9123-494a-bb29-f39b924e6243" Nov 24 09:05:40 crc kubenswrapper[4563]: I1124 09:05:40.054447 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:40 crc kubenswrapper[4563]: I1124 09:05:40.054487 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:40 crc kubenswrapper[4563]: I1124 09:05:40.054512 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:40 crc kubenswrapper[4563]: E1124 09:05:40.054540 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:05:40 crc kubenswrapper[4563]: E1124 09:05:40.054605 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:05:40 crc kubenswrapper[4563]: E1124 09:05:40.054717 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:05:40 crc kubenswrapper[4563]: I1124 09:05:40.477030 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nw8xd_019bd805-9123-494a-bb29-f39b924e6243/kube-multus/1.log" Nov 24 09:05:41 crc kubenswrapper[4563]: I1124 09:05:41.054081 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:05:41 crc kubenswrapper[4563]: E1124 09:05:41.054398 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:05:41 crc kubenswrapper[4563]: I1124 09:05:41.054631 4563 scope.go:117] "RemoveContainer" containerID="d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772" Nov 24 09:05:41 crc kubenswrapper[4563]: I1124 09:05:41.481622 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vgbgr_cee9b713-10b0-49a5-841d-fbb083faba9a/ovnkube-controller/3.log" Nov 24 09:05:41 crc kubenswrapper[4563]: I1124 09:05:41.483804 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" event={"ID":"cee9b713-10b0-49a5-841d-fbb083faba9a","Type":"ContainerStarted","Data":"02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f"} Nov 24 09:05:41 crc kubenswrapper[4563]: I1124 09:05:41.484544 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:05:41 crc kubenswrapper[4563]: I1124 09:05:41.508330 4563 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" podStartSLOduration=96.508316093 podStartE2EDuration="1m36.508316093s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:05:41.508278883 +0000 UTC m=+118.767256330" watchObservedRunningTime="2025-11-24 09:05:41.508316093 +0000 UTC m=+118.767293541" Nov 24 09:05:41 crc kubenswrapper[4563]: I1124 09:05:41.633754 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bsfsd"] Nov 24 09:05:41 crc kubenswrapper[4563]: I1124 09:05:41.633836 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:05:41 crc kubenswrapper[4563]: E1124 09:05:41.633908 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:05:42 crc kubenswrapper[4563]: I1124 09:05:42.053803 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:42 crc kubenswrapper[4563]: I1124 09:05:42.053866 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:42 crc kubenswrapper[4563]: E1124 09:05:42.053901 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:05:42 crc kubenswrapper[4563]: E1124 09:05:42.053960 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:05:42 crc kubenswrapper[4563]: I1124 09:05:42.053866 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:42 crc kubenswrapper[4563]: E1124 09:05:42.054036 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:05:43 crc kubenswrapper[4563]: E1124 09:05:43.031365 4563 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Nov 24 09:05:43 crc kubenswrapper[4563]: E1124 09:05:43.121589 4563 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 24 09:05:44 crc kubenswrapper[4563]: I1124 09:05:44.053909 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:05:44 crc kubenswrapper[4563]: I1124 09:05:44.053914 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:44 crc kubenswrapper[4563]: I1124 09:05:44.053951 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:44 crc kubenswrapper[4563]: I1124 09:05:44.053988 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:44 crc kubenswrapper[4563]: E1124 09:05:44.054100 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:05:44 crc kubenswrapper[4563]: E1124 09:05:44.054170 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:05:44 crc kubenswrapper[4563]: E1124 09:05:44.054225 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:05:44 crc kubenswrapper[4563]: E1124 09:05:44.054266 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:05:46 crc kubenswrapper[4563]: I1124 09:05:46.054485 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:46 crc kubenswrapper[4563]: I1124 09:05:46.054590 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:46 crc kubenswrapper[4563]: E1124 09:05:46.054617 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:05:46 crc kubenswrapper[4563]: I1124 09:05:46.054489 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:05:46 crc kubenswrapper[4563]: I1124 09:05:46.054490 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:46 crc kubenswrapper[4563]: E1124 09:05:46.054739 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:05:46 crc kubenswrapper[4563]: E1124 09:05:46.054798 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:05:46 crc kubenswrapper[4563]: E1124 09:05:46.054887 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:05:48 crc kubenswrapper[4563]: I1124 09:05:48.053977 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:48 crc kubenswrapper[4563]: I1124 09:05:48.053977 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:48 crc kubenswrapper[4563]: I1124 09:05:48.053979 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:48 crc kubenswrapper[4563]: E1124 09:05:48.054981 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:05:48 crc kubenswrapper[4563]: E1124 09:05:48.054867 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:05:48 crc kubenswrapper[4563]: I1124 09:05:48.054004 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:05:48 crc kubenswrapper[4563]: E1124 09:05:48.055034 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:05:48 crc kubenswrapper[4563]: E1124 09:05:48.055111 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:05:48 crc kubenswrapper[4563]: E1124 09:05:48.123769 4563 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 24 09:05:50 crc kubenswrapper[4563]: I1124 09:05:50.054685 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:50 crc kubenswrapper[4563]: I1124 09:05:50.054725 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:50 crc kubenswrapper[4563]: E1124 09:05:50.054809 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:05:50 crc kubenswrapper[4563]: I1124 09:05:50.054695 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:05:50 crc kubenswrapper[4563]: E1124 09:05:50.054877 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:05:50 crc kubenswrapper[4563]: I1124 09:05:50.054907 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:50 crc kubenswrapper[4563]: E1124 09:05:50.055054 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:05:50 crc kubenswrapper[4563]: E1124 09:05:50.055132 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:05:52 crc kubenswrapper[4563]: I1124 09:05:52.054315 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:05:52 crc kubenswrapper[4563]: I1124 09:05:52.054338 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:52 crc kubenswrapper[4563]: I1124 09:05:52.054394 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:52 crc kubenswrapper[4563]: I1124 09:05:52.054392 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:52 crc kubenswrapper[4563]: E1124 09:05:52.054513 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:05:52 crc kubenswrapper[4563]: E1124 09:05:52.054616 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:05:52 crc kubenswrapper[4563]: E1124 09:05:52.054911 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:05:52 crc kubenswrapper[4563]: I1124 09:05:52.054924 4563 scope.go:117] "RemoveContainer" containerID="381c5f62c655111b7df341bae96a5edef6bcd2d5c3a8758d07465c278445bb8a" Nov 24 09:05:52 crc kubenswrapper[4563]: E1124 09:05:52.054956 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:05:52 crc kubenswrapper[4563]: I1124 09:05:52.511330 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nw8xd_019bd805-9123-494a-bb29-f39b924e6243/kube-multus/1.log" Nov 24 09:05:52 crc kubenswrapper[4563]: I1124 09:05:52.511587 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nw8xd" event={"ID":"019bd805-9123-494a-bb29-f39b924e6243","Type":"ContainerStarted","Data":"eb2ac8e61357886c955d8ea2e45d3e7697fed103f3408d6c13b3011e6f152b1c"} Nov 24 09:05:53 crc kubenswrapper[4563]: E1124 09:05:53.124150 4563 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 24 09:05:54 crc kubenswrapper[4563]: I1124 09:05:54.054183 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:54 crc kubenswrapper[4563]: I1124 09:05:54.054209 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:05:54 crc kubenswrapper[4563]: I1124 09:05:54.054223 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:54 crc kubenswrapper[4563]: E1124 09:05:54.054275 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:05:54 crc kubenswrapper[4563]: I1124 09:05:54.054343 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:54 crc kubenswrapper[4563]: E1124 09:05:54.054439 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:05:54 crc kubenswrapper[4563]: E1124 09:05:54.054485 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:05:54 crc kubenswrapper[4563]: E1124 09:05:54.054549 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:05:56 crc kubenswrapper[4563]: I1124 09:05:56.053603 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:56 crc kubenswrapper[4563]: I1124 09:05:56.053655 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:56 crc kubenswrapper[4563]: I1124 09:05:56.053693 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:56 crc kubenswrapper[4563]: E1124 09:05:56.053721 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:05:56 crc kubenswrapper[4563]: E1124 09:05:56.053803 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:05:56 crc kubenswrapper[4563]: E1124 09:05:56.053867 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:05:56 crc kubenswrapper[4563]: I1124 09:05:56.053915 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:05:56 crc kubenswrapper[4563]: E1124 09:05:56.054133 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:05:58 crc kubenswrapper[4563]: I1124 09:05:58.054096 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:05:58 crc kubenswrapper[4563]: I1124 09:05:58.054125 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:05:58 crc kubenswrapper[4563]: I1124 09:05:58.054159 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:05:58 crc kubenswrapper[4563]: I1124 09:05:58.054099 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:05:58 crc kubenswrapper[4563]: E1124 09:05:58.054222 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bsfsd" podUID="4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0" Nov 24 09:05:58 crc kubenswrapper[4563]: E1124 09:05:58.054347 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 24 09:05:58 crc kubenswrapper[4563]: E1124 09:05:58.054439 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 24 09:05:58 crc kubenswrapper[4563]: E1124 09:05:58.054601 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 24 09:06:00 crc kubenswrapper[4563]: I1124 09:06:00.054289 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:06:00 crc kubenswrapper[4563]: I1124 09:06:00.054336 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:06:00 crc kubenswrapper[4563]: I1124 09:06:00.054339 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd" Nov 24 09:06:00 crc kubenswrapper[4563]: I1124 09:06:00.054306 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:06:00 crc kubenswrapper[4563]: I1124 09:06:00.056053 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 24 09:06:00 crc kubenswrapper[4563]: I1124 09:06:00.056194 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 24 09:06:00 crc kubenswrapper[4563]: I1124 09:06:00.056212 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 24 09:06:00 crc kubenswrapper[4563]: I1124 09:06:00.056238 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 24 09:06:00 crc kubenswrapper[4563]: I1124 09:06:00.056505 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 24 09:06:00 crc kubenswrapper[4563]: I1124 09:06:00.056523 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.225806 4563 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.250670 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lpv9k"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.251036 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lpv9k" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.251466 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-g9q5v"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.251815 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-g9q5v" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.251845 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.252140 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.254149 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.254437 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.255277 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fdqpk"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.255473 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.255590 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fdqpk" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.257013 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nnx6h"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.257549 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.257855 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9bp54"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.258321 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-9bp54" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.262793 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.262955 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.263110 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.263252 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.263517 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.263718 4563 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-session" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.263764 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.263974 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.264028 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.264031 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.264155 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.264278 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.264350 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.264460 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.264549 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.267289 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 
09:06:05.267495 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.268329 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jqkcz"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.268691 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jqkcz" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.269478 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7c61447-7ef6-408e-8034-46506b36b5d2-serving-cert\") pod \"route-controller-manager-6576b87f9c-rtrwd\" (UID: \"e7c61447-7ef6-408e-8034-46506b36b5d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.269510 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99e63a17-8605-4830-96b4-dd619cf76549-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-g9q5v\" (UID: \"99e63a17-8605-4830-96b4-dd619cf76549\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g9q5v" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.269527 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c8ee9e7-2ff1-4be6-bd44-60193b7aed66-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lpv9k\" (UID: \"8c8ee9e7-2ff1-4be6-bd44-60193b7aed66\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lpv9k" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.269544 4563 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np4h5\" (UniqueName: \"kubernetes.io/projected/8c8ee9e7-2ff1-4be6-bd44-60193b7aed66-kube-api-access-np4h5\") pod \"openshift-apiserver-operator-796bbdcf4f-lpv9k\" (UID: \"8c8ee9e7-2ff1-4be6-bd44-60193b7aed66\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lpv9k" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.269599 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99e63a17-8605-4830-96b4-dd619cf76549-client-ca\") pod \"controller-manager-879f6c89f-g9q5v\" (UID: \"99e63a17-8605-4830-96b4-dd619cf76549\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g9q5v" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.269631 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tkg5\" (UniqueName: \"kubernetes.io/projected/99e63a17-8605-4830-96b4-dd619cf76549-kube-api-access-4tkg5\") pod \"controller-manager-879f6c89f-g9q5v\" (UID: \"99e63a17-8605-4830-96b4-dd619cf76549\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g9q5v" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.269665 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/728c9cea-9302-4856-95cc-2ea71352ec94-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fdqpk\" (UID: \"728c9cea-9302-4856-95cc-2ea71352ec94\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fdqpk" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.269679 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.269694 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.269706 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c61447-7ef6-408e-8034-46506b36b5d2-config\") pod \"route-controller-manager-6576b87f9c-rtrwd\" (UID: \"e7c61447-7ef6-408e-8034-46506b36b5d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.269722 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.269742 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99e63a17-8605-4830-96b4-dd619cf76549-serving-cert\") pod \"controller-manager-879f6c89f-g9q5v\" (UID: \"99e63a17-8605-4830-96b4-dd619cf76549\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-g9q5v" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.269757 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xksh4\" (UniqueName: \"kubernetes.io/projected/27cfa44e-65c6-48d1-bafb-ce4806fc043b-kube-api-access-xksh4\") pod \"console-operator-58897d9998-9bp54\" (UID: \"27cfa44e-65c6-48d1-bafb-ce4806fc043b\") " pod="openshift-console-operator/console-operator-58897d9998-9bp54" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.269777 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c8ee9e7-2ff1-4be6-bd44-60193b7aed66-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lpv9k\" (UID: \"8c8ee9e7-2ff1-4be6-bd44-60193b7aed66\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lpv9k" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.269804 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/245aea2a-4167-418a-910c-91bf4836c8dc-audit-policies\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.269817 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.269830 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.269843 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7c61447-7ef6-408e-8034-46506b36b5d2-client-ca\") pod \"route-controller-manager-6576b87f9c-rtrwd\" (UID: \"e7c61447-7ef6-408e-8034-46506b36b5d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.269865 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw2j4\" (UniqueName: \"kubernetes.io/projected/e7c61447-7ef6-408e-8034-46506b36b5d2-kube-api-access-tw2j4\") pod \"route-controller-manager-6576b87f9c-rtrwd\" (UID: \"e7c61447-7ef6-408e-8034-46506b36b5d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.269879 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/245aea2a-4167-418a-910c-91bf4836c8dc-audit-dir\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.269893 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.269908 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.269921 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27cfa44e-65c6-48d1-bafb-ce4806fc043b-config\") pod \"console-operator-58897d9998-9bp54\" (UID: \"27cfa44e-65c6-48d1-bafb-ce4806fc043b\") " pod="openshift-console-operator/console-operator-58897d9998-9bp54" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.269935 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/27cfa44e-65c6-48d1-bafb-ce4806fc043b-trusted-ca\") pod \"console-operator-58897d9998-9bp54\" (UID: \"27cfa44e-65c6-48d1-bafb-ce4806fc043b\") " pod="openshift-console-operator/console-operator-58897d9998-9bp54" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.269948 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jk4l\" (UniqueName: \"kubernetes.io/projected/245aea2a-4167-418a-910c-91bf4836c8dc-kube-api-access-5jk4l\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 
24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.269961 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4fx8\" (UniqueName: \"kubernetes.io/projected/728c9cea-9302-4856-95cc-2ea71352ec94-kube-api-access-k4fx8\") pod \"cluster-samples-operator-665b6dd947-fdqpk\" (UID: \"728c9cea-9302-4856-95cc-2ea71352ec94\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fdqpk" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.269974 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27cfa44e-65c6-48d1-bafb-ce4806fc043b-serving-cert\") pod \"console-operator-58897d9998-9bp54\" (UID: \"27cfa44e-65c6-48d1-bafb-ce4806fc043b\") " pod="openshift-console-operator/console-operator-58897d9998-9bp54" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.269989 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99e63a17-8605-4830-96b4-dd619cf76549-config\") pod \"controller-manager-879f6c89f-g9q5v\" (UID: \"99e63a17-8605-4830-96b4-dd619cf76549\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g9q5v" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.270004 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.270069 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.270101 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.270142 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.270719 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-l6b8r"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.271071 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.271155 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.271204 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.271207 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.271605 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.276510 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.276720 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.276744 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.276959 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.277029 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.277243 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.277379 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.277472 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 
09:06:05.277510 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.277663 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.277673 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.277717 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.278028 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-92skl"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.278441 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-92skl" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.280748 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.281005 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.281964 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.282163 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.282208 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-tzmwx"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.282538 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-tzmwx" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.284310 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.284461 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.284550 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.285833 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.286314 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.286627 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.286712 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.286787 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-q4vst"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.287150 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q4vst" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.287386 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-7hx7w"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.287667 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.287731 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-7hx7w" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.287791 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.288000 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.288136 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.288440 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-jmv44"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.288791 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.288874 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.288874 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmv44" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.289012 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.289112 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.289194 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.289770 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.289857 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.289932 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.290011 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.290155 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.290236 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.290309 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.290376 4563 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.290452 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.290530 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.294113 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54kkk"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.294444 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-rshgn"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.294793 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-bxdlb"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.294866 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-rshgn" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.294867 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54kkk" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.303323 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-bxdlb" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.310364 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nsrtq"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.311028 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lpv9k"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.311053 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mm4b8"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.311493 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.311762 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-nsrtq" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.295089 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.314461 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6l6vs"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.302844 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.302946 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.303082 4563 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"kube-root-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.307158 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.307160 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.314276 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.314370 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.314417 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.314725 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.318370 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.318501 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.318605 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.318714 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.318723 4563 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.318867 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.318937 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.323327 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.329687 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-g9q5v"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.329778 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6l6vs" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.330230 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.330308 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-z4lb4"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.330739 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-z4lb4" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.330942 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.331023 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.331079 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.331314 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.331029 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.331678 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.332302 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xjp4v"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.332497 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.332649 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.332777 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 24 09:06:05 crc 
kubenswrapper[4563]: I1124 09:06:05.332928 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.333044 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.334539 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9b6t7"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.334848 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9b6t7" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.335143 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xjp4v" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.335213 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fdqpk"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.336154 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.336286 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.337254 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.338249 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.341188 4563 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.341621 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.342843 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.342849 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jv526"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.343354 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vsmrn"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.343811 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vsmrn" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.343929 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wwmlv"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.344476 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.344518 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jv526" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.344705 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-fd8lt"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.344868 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wwmlv" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.345057 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-fd8lt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.345829 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jbgzc"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.346281 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vpxzg"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.346754 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vpxzg" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.346942 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jbgzc" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.347243 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zmp5c"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.347848 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zmp5c" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.347880 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-r5h7f"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.348235 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-r5h7f" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.349146 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399580-sccd5"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.349446 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399580-sccd5" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.351409 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5mx8n"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.351825 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5mx8n" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.352132 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-jj9mw"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.352591 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-jj9mw" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.352930 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kj6k4"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.353355 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kj6k4" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.354081 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.360590 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zgnq6"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.361683 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zgnq6" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.364076 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ppl5f"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.364594 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-thzh7"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.365201 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ppl5f" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.370873 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9b6t7"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.370945 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-thzh7" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.371998 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e13a5b1-f9f7-4045-952a-a44cfd536a99-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zmp5c\" (UID: \"4e13a5b1-f9f7-4045-952a-a44cfd536a99\") " pod="openshift-marketplace/marketplace-operator-79b997595-zmp5c" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.372113 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxrqz\" (UniqueName: \"kubernetes.io/projected/2eb44334-ab46-4eca-a39f-7b289792b178-kube-api-access-pxrqz\") pod \"packageserver-d55dfcdfc-jv526\" (UID: \"2eb44334-ab46-4eca-a39f-7b289792b178\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jv526" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.372199 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/20670dac-a915-49d2-8953-cb842980ca87-encryption-config\") pod \"apiserver-7bbb656c7d-4zwrj\" (UID: \"20670dac-a915-49d2-8953-cb842980ca87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.372267 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48fdef5e-c65c-4898-af52-6ea141ab67b7-config\") pod \"kube-apiserver-operator-766d6c64bb-ppl5f\" (UID: \"48fdef5e-c65c-4898-af52-6ea141ab67b7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ppl5f" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.372354 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-image-import-ca\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.372435 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/20670dac-a915-49d2-8953-cb842980ca87-etcd-client\") pod \"apiserver-7bbb656c7d-4zwrj\" (UID: \"20670dac-a915-49d2-8953-cb842980ca87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.372512 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20670dac-a915-49d2-8953-cb842980ca87-serving-cert\") pod \"apiserver-7bbb656c7d-4zwrj\" (UID: \"20670dac-a915-49d2-8953-cb842980ca87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.373337 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-config\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.373415 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-etcd-serving-ca\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc 
kubenswrapper[4563]: I1124 09:06:05.373541 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99e63a17-8605-4830-96b4-dd619cf76549-serving-cert\") pod \"controller-manager-879f6c89f-g9q5v\" (UID: \"99e63a17-8605-4830-96b4-dd619cf76549\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g9q5v" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.373624 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xksh4\" (UniqueName: \"kubernetes.io/projected/27cfa44e-65c6-48d1-bafb-ce4806fc043b-kube-api-access-xksh4\") pod \"console-operator-58897d9998-9bp54\" (UID: \"27cfa44e-65c6-48d1-bafb-ce4806fc043b\") " pod="openshift-console-operator/console-operator-58897d9998-9bp54" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.373681 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h967v\" (UniqueName: \"kubernetes.io/projected/378c7d30-dd7c-4aa5-83cf-7caca587f283-kube-api-access-h967v\") pod \"control-plane-machine-set-operator-78cbb6b69f-z4lb4\" (UID: \"378c7d30-dd7c-4aa5-83cf-7caca587f283\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-z4lb4" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.373748 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh8l6\" (UniqueName: \"kubernetes.io/projected/6f15c383-9eb5-4942-a63d-48e54beea23d-kube-api-access-rh8l6\") pod \"collect-profiles-29399580-sccd5\" (UID: \"6f15c383-9eb5-4942-a63d-48e54beea23d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399580-sccd5" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.373775 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8c8ee9e7-2ff1-4be6-bd44-60193b7aed66-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lpv9k\" (UID: \"8c8ee9e7-2ff1-4be6-bd44-60193b7aed66\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lpv9k" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.374268 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twccj\" (UniqueName: \"kubernetes.io/projected/7aeea6e8-f475-47dc-8b80-fade6640c678-kube-api-access-twccj\") pod \"machine-config-operator-74547568cd-kj6k4\" (UID: \"7aeea6e8-f475-47dc-8b80-fade6640c678\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kj6k4" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.374894 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f15c383-9eb5-4942-a63d-48e54beea23d-config-volume\") pod \"collect-profiles-29399580-sccd5\" (UID: \"6f15c383-9eb5-4942-a63d-48e54beea23d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399580-sccd5" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.374990 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7c61447-7ef6-408e-8034-46506b36b5d2-client-ca\") pod \"route-controller-manager-6576b87f9c-rtrwd\" (UID: \"e7c61447-7ef6-408e-8034-46506b36b5d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.375132 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/245aea2a-4167-418a-910c-91bf4836c8dc-audit-policies\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.375234 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.375310 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.375391 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw2j4\" (UniqueName: \"kubernetes.io/projected/e7c61447-7ef6-408e-8034-46506b36b5d2-kube-api-access-tw2j4\") pod \"route-controller-manager-6576b87f9c-rtrwd\" (UID: \"e7c61447-7ef6-408e-8034-46506b36b5d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.375503 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/48157749-8872-4c5b-b119-efe27cfd887e-images\") pod \"machine-api-operator-5694c8668f-bxdlb\" (UID: \"48157749-8872-4c5b-b119-efe27cfd887e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bxdlb" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.375598 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/245aea2a-4167-418a-910c-91bf4836c8dc-audit-dir\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.375782 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.375862 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/898668ee-3043-4bdc-8e77-82c108bcc65d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zgnq6\" (UID: \"898668ee-3043-4bdc-8e77-82c108bcc65d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zgnq6" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.376429 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nnx6h"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.376453 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.376933 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: 
I1124 09:06:05.376946 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xjp4v"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.377086 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/245aea2a-4167-418a-910c-91bf4836c8dc-audit-dir\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.377458 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.377549 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27cfa44e-65c6-48d1-bafb-ce4806fc043b-config\") pod \"console-operator-58897d9998-9bp54\" (UID: \"27cfa44e-65c6-48d1-bafb-ce4806fc043b\") " pod="openshift-console-operator/console-operator-58897d9998-9bp54" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.378246 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jk4l\" (UniqueName: \"kubernetes.io/projected/245aea2a-4167-418a-910c-91bf4836c8dc-kube-api-access-5jk4l\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.378318 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.378339 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/27cfa44e-65c6-48d1-bafb-ce4806fc043b-trusted-ca\") pod \"console-operator-58897d9998-9bp54\" (UID: \"27cfa44e-65c6-48d1-bafb-ce4806fc043b\") " pod="openshift-console-operator/console-operator-58897d9998-9bp54" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.378384 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-etcd-client\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.378454 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2eb44334-ab46-4eca-a39f-7b289792b178-webhook-cert\") pod \"packageserver-d55dfcdfc-jv526\" (UID: \"2eb44334-ab46-4eca-a39f-7b289792b178\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jv526" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.378541 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27cfa44e-65c6-48d1-bafb-ce4806fc043b-config\") pod \"console-operator-58897d9998-9bp54\" (UID: \"27cfa44e-65c6-48d1-bafb-ce4806fc043b\") " pod="openshift-console-operator/console-operator-58897d9998-9bp54" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.378865 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffvcw\" (UniqueName: \"kubernetes.io/projected/4e13a5b1-f9f7-4045-952a-a44cfd536a99-kube-api-access-ffvcw\") pod \"marketplace-operator-79b997595-zmp5c\" (UID: \"4e13a5b1-f9f7-4045-952a-a44cfd536a99\") " pod="openshift-marketplace/marketplace-operator-79b997595-zmp5c" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.378901 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdrch\" (UniqueName: \"kubernetes.io/projected/dcb29296-6b35-478f-9712-ef96d33867c2-kube-api-access-bdrch\") pod \"migrator-59844c95c7-vsmrn\" (UID: \"dcb29296-6b35-478f-9712-ef96d33867c2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vsmrn" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.378972 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99e63a17-8605-4830-96b4-dd619cf76549-config\") pod \"controller-manager-879f6c89f-g9q5v\" (UID: \"99e63a17-8605-4830-96b4-dd619cf76549\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g9q5v" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.379040 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4fx8\" (UniqueName: \"kubernetes.io/projected/728c9cea-9302-4856-95cc-2ea71352ec94-kube-api-access-k4fx8\") pod \"cluster-samples-operator-665b6dd947-fdqpk\" (UID: \"728c9cea-9302-4856-95cc-2ea71352ec94\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fdqpk" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.379059 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27cfa44e-65c6-48d1-bafb-ce4806fc043b-serving-cert\") pod \"console-operator-58897d9998-9bp54\" (UID: \"27cfa44e-65c6-48d1-bafb-ce4806fc043b\") 
" pod="openshift-console-operator/console-operator-58897d9998-9bp54" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.379131 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/898668ee-3043-4bdc-8e77-82c108bcc65d-config\") pod \"kube-controller-manager-operator-78b949d7b-zgnq6\" (UID: \"898668ee-3043-4bdc-8e77-82c108bcc65d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zgnq6" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.379184 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07ac5545-11d9-470e-a4e0-1073333eebdc-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wwmlv\" (UID: \"07ac5545-11d9-470e-a4e0-1073333eebdc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wwmlv" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.379207 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/20670dac-a915-49d2-8953-cb842980ca87-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4zwrj\" (UID: \"20670dac-a915-49d2-8953-cb842980ca87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.379225 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/20670dac-a915-49d2-8953-cb842980ca87-audit-dir\") pod \"apiserver-7bbb656c7d-4zwrj\" (UID: \"20670dac-a915-49d2-8953-cb842980ca87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.379397 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.379444 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-audit\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.379461 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-serving-cert\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.379478 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-audit-dir\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.379520 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07ac5545-11d9-470e-a4e0-1073333eebdc-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wwmlv\" (UID: \"07ac5545-11d9-470e-a4e0-1073333eebdc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wwmlv" Nov 24 09:06:05 crc 
kubenswrapper[4563]: I1124 09:06:05.379593 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.379667 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-node-pullsecrets\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.379696 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/378c7d30-dd7c-4aa5-83cf-7caca587f283-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-z4lb4\" (UID: \"378c7d30-dd7c-4aa5-83cf-7caca587f283\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-z4lb4" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.379719 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6ca055d5-b576-4aa7-bcb2-138156414ff0-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xjp4v\" (UID: \"6ca055d5-b576-4aa7-bcb2-138156414ff0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xjp4v" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.379798 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/48157749-8872-4c5b-b119-efe27cfd887e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bxdlb\" (UID: \"48157749-8872-4c5b-b119-efe27cfd887e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bxdlb" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.379861 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20670dac-a915-49d2-8953-cb842980ca87-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4zwrj\" (UID: \"20670dac-a915-49d2-8953-cb842980ca87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.380000 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.380046 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-trusted-ca-bundle\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.384842 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2eb44334-ab46-4eca-a39f-7b289792b178-apiservice-cert\") pod \"packageserver-d55dfcdfc-jv526\" (UID: \"2eb44334-ab46-4eca-a39f-7b289792b178\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jv526" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.384867 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7aeea6e8-f475-47dc-8b80-fade6640c678-proxy-tls\") pod \"machine-config-operator-74547568cd-kj6k4\" (UID: \"7aeea6e8-f475-47dc-8b80-fade6640c678\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kj6k4" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.384883 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a1768f84-8c65-48c1-beb8-1309d1d2e823-auth-proxy-config\") pod \"machine-approver-56656f9798-q4vst\" (UID: \"a1768f84-8c65-48c1-beb8-1309d1d2e823\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q4vst" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.384900 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48fdef5e-c65c-4898-af52-6ea141ab67b7-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-ppl5f\" (UID: \"48fdef5e-c65c-4898-af52-6ea141ab67b7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ppl5f" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.384915 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48fdef5e-c65c-4898-af52-6ea141ab67b7-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-ppl5f\" (UID: \"48fdef5e-c65c-4898-af52-6ea141ab67b7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ppl5f" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.381607 4563 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/245aea2a-4167-418a-910c-91bf4836c8dc-audit-policies\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.384936 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.384979 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7aeea6e8-f475-47dc-8b80-fade6640c678-images\") pod \"machine-config-operator-74547568cd-kj6k4\" (UID: \"7aeea6e8-f475-47dc-8b80-fade6640c678\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kj6k4" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.385007 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqzqs\" (UniqueName: \"kubernetes.io/projected/20670dac-a915-49d2-8953-cb842980ca87-kube-api-access-fqzqs\") pod \"apiserver-7bbb656c7d-4zwrj\" (UID: \"20670dac-a915-49d2-8953-cb842980ca87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.385032 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7c61447-7ef6-408e-8034-46506b36b5d2-serving-cert\") pod \"route-controller-manager-6576b87f9c-rtrwd\" (UID: \"e7c61447-7ef6-408e-8034-46506b36b5d2\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.385049 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4e13a5b1-f9f7-4045-952a-a44cfd536a99-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zmp5c\" (UID: \"4e13a5b1-f9f7-4045-952a-a44cfd536a99\") " pod="openshift-marketplace/marketplace-operator-79b997595-zmp5c" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.385065 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48157749-8872-4c5b-b119-efe27cfd887e-config\") pod \"machine-api-operator-5694c8668f-bxdlb\" (UID: \"48157749-8872-4c5b-b119-efe27cfd887e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bxdlb" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.385080 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07ac5545-11d9-470e-a4e0-1073333eebdc-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wwmlv\" (UID: \"07ac5545-11d9-470e-a4e0-1073333eebdc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wwmlv" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.385099 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99e63a17-8605-4830-96b4-dd619cf76549-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-g9q5v\" (UID: \"99e63a17-8605-4830-96b4-dd619cf76549\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g9q5v" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.381715 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/27cfa44e-65c6-48d1-bafb-ce4806fc043b-trusted-ca\") pod \"console-operator-58897d9998-9bp54\" (UID: \"27cfa44e-65c6-48d1-bafb-ce4806fc043b\") " pod="openshift-console-operator/console-operator-58897d9998-9bp54" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.385134 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c8ee9e7-2ff1-4be6-bd44-60193b7aed66-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lpv9k\" (UID: \"8c8ee9e7-2ff1-4be6-bd44-60193b7aed66\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lpv9k" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.385165 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np4h5\" (UniqueName: \"kubernetes.io/projected/8c8ee9e7-2ff1-4be6-bd44-60193b7aed66-kube-api-access-np4h5\") pod \"openshift-apiserver-operator-796bbdcf4f-lpv9k\" (UID: \"8c8ee9e7-2ff1-4be6-bd44-60193b7aed66\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lpv9k" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.385187 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1768f84-8c65-48c1-beb8-1309d1d2e823-config\") pod \"machine-approver-56656f9798-q4vst\" (UID: \"a1768f84-8c65-48c1-beb8-1309d1d2e823\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q4vst" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.382305 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.384793 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7c61447-7ef6-408e-8034-46506b36b5d2-client-ca\") pod \"route-controller-manager-6576b87f9c-rtrwd\" (UID: \"e7c61447-7ef6-408e-8034-46506b36b5d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.385206 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzr8v\" (UniqueName: \"kubernetes.io/projected/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-kube-api-access-fzr8v\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.382121 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-jmv44"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.385247 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/898668ee-3043-4bdc-8e77-82c108bcc65d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zgnq6\" (UID: \"898668ee-3043-4bdc-8e77-82c108bcc65d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zgnq6" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.385257 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jqkcz"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.385271 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-l6b8r"] Nov 24 09:06:05 crc 
kubenswrapper[4563]: I1124 09:06:05.382105 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99e63a17-8605-4830-96b4-dd619cf76549-config\") pod \"controller-manager-879f6c89f-g9q5v\" (UID: \"99e63a17-8605-4830-96b4-dd619cf76549\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g9q5v" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.385378 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs54w\" (UniqueName: \"kubernetes.io/projected/48157749-8872-4c5b-b119-efe27cfd887e-kube-api-access-xs54w\") pod \"machine-api-operator-5694c8668f-bxdlb\" (UID: \"48157749-8872-4c5b-b119-efe27cfd887e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bxdlb" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.385387 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mm4b8"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.385967 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c8ee9e7-2ff1-4be6-bd44-60193b7aed66-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lpv9k\" (UID: \"8c8ee9e7-2ff1-4be6-bd44-60193b7aed66\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lpv9k" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.386151 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99e63a17-8605-4830-96b4-dd619cf76549-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-g9q5v\" (UID: \"99e63a17-8605-4830-96b4-dd619cf76549\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g9q5v" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.386715 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.386850 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99e63a17-8605-4830-96b4-dd619cf76549-serving-cert\") pod \"controller-manager-879f6c89f-g9q5v\" (UID: \"99e63a17-8605-4830-96b4-dd619cf76549\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g9q5v" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.387386 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.387435 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgc69\" (UniqueName: \"kubernetes.io/projected/a1768f84-8c65-48c1-beb8-1309d1d2e823-kube-api-access-lgc69\") pod \"machine-approver-56656f9798-q4vst\" (UID: \"a1768f84-8c65-48c1-beb8-1309d1d2e823\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q4vst" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.387469 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99e63a17-8605-4830-96b4-dd619cf76549-client-ca\") pod \"controller-manager-879f6c89f-g9q5v\" (UID: \"99e63a17-8605-4830-96b4-dd619cf76549\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-g9q5v" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.387489 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f15c383-9eb5-4942-a63d-48e54beea23d-secret-volume\") pod \"collect-profiles-29399580-sccd5\" (UID: \"6f15c383-9eb5-4942-a63d-48e54beea23d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399580-sccd5" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.387513 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7aeea6e8-f475-47dc-8b80-fade6640c678-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kj6k4\" (UID: \"7aeea6e8-f475-47dc-8b80-fade6640c678\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kj6k4" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.387530 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzzfd\" (UniqueName: \"kubernetes.io/projected/6ca055d5-b576-4aa7-bcb2-138156414ff0-kube-api-access-gzzfd\") pod \"package-server-manager-789f6589d5-xjp4v\" (UID: \"6ca055d5-b576-4aa7-bcb2-138156414ff0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xjp4v" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.387545 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/20670dac-a915-49d2-8953-cb842980ca87-audit-policies\") pod \"apiserver-7bbb656c7d-4zwrj\" (UID: \"20670dac-a915-49d2-8953-cb842980ca87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.387568 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-encryption-config\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.387597 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2eb44334-ab46-4eca-a39f-7b289792b178-tmpfs\") pod \"packageserver-d55dfcdfc-jv526\" (UID: \"2eb44334-ab46-4eca-a39f-7b289792b178\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jv526" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.387616 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tkg5\" (UniqueName: \"kubernetes.io/projected/99e63a17-8605-4830-96b4-dd619cf76549-kube-api-access-4tkg5\") pod \"controller-manager-879f6c89f-g9q5v\" (UID: \"99e63a17-8605-4830-96b4-dd619cf76549\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g9q5v" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.387662 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.387678 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c61447-7ef6-408e-8034-46506b36b5d2-config\") pod \"route-controller-manager-6576b87f9c-rtrwd\" (UID: \"e7c61447-7ef6-408e-8034-46506b36b5d2\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.387697 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/728c9cea-9302-4856-95cc-2ea71352ec94-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fdqpk\" (UID: \"728c9cea-9302-4856-95cc-2ea71352ec94\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fdqpk" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.387710 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.387727 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a1768f84-8c65-48c1-beb8-1309d1d2e823-machine-approver-tls\") pod \"machine-approver-56656f9798-q4vst\" (UID: \"a1768f84-8c65-48c1-beb8-1309d1d2e823\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q4vst" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.387744 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.387763 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xnff\" (UniqueName: \"kubernetes.io/projected/b5a091e0-6549-4f21-a0dc-5f7452dc9c0f-kube-api-access-4xnff\") pod \"downloads-7954f5f757-tzmwx\" (UID: \"b5a091e0-6549-4f21-a0dc-5f7452dc9c0f\") " pod="openshift-console/downloads-7954f5f757-tzmwx" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.388474 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-7hx7w"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.388508 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.389168 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.389413 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c61447-7ef6-408e-8034-46506b36b5d2-config\") pod \"route-controller-manager-6576b87f9c-rtrwd\" (UID: \"e7c61447-7ef6-408e-8034-46506b36b5d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.389415 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7c61447-7ef6-408e-8034-46506b36b5d2-serving-cert\") pod \"route-controller-manager-6576b87f9c-rtrwd\" (UID: \"e7c61447-7ef6-408e-8034-46506b36b5d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd" Nov 24 09:06:05 crc 
kubenswrapper[4563]: I1124 09:06:05.389427 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-92skl"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.389525 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.390336 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9bp54"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.390438 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.390490 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.391129 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27cfa44e-65c6-48d1-bafb-ce4806fc043b-serving-cert\") pod \"console-operator-58897d9998-9bp54\" (UID: \"27cfa44e-65c6-48d1-bafb-ce4806fc043b\") " 
pod="openshift-console-operator/console-operator-58897d9998-9bp54" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.391371 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c8ee9e7-2ff1-4be6-bd44-60193b7aed66-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lpv9k\" (UID: \"8c8ee9e7-2ff1-4be6-bd44-60193b7aed66\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lpv9k" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.391405 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wsk9h"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.391614 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/728c9cea-9302-4856-95cc-2ea71352ec94-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fdqpk\" (UID: \"728c9cea-9302-4856-95cc-2ea71352ec94\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fdqpk" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.392003 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.392016 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:05 crc kubenswrapper[4563]: 
I1124 09:06:05.392354 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-bxdlb"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.392554 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.393028 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-59266"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.393270 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99e63a17-8605-4830-96b4-dd619cf76549-client-ca\") pod \"controller-manager-879f6c89f-g9q5v\" (UID: \"99e63a17-8605-4830-96b4-dd619cf76549\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g9q5v" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.393435 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wsk9h" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.393907 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-59266" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.394604 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jbgzc"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.395901 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.396703 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jv526"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.397557 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-z4lb4"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.398343 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6l6vs"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.399138 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zmp5c"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.399949 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vpxzg"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.401025 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kj6k4"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.402949 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-rshgn"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.403992 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca/service-ca-9c57cc56f-fd8lt"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.404824 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nsrtq"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.405738 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54kkk"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.406925 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-jj9mw"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.407963 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-tzmwx"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.409182 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vsmrn"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.410136 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wsk9h"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.410995 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wwmlv"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.412251 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-thzh7"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.412691 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.413239 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5mx8n"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 
09:06:05.414378 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ppl5f"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.415265 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zgnq6"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.416339 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399580-sccd5"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.417470 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-glp9q"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.418184 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-glp9q" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.418674 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-zq6xk"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.420343 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-zq6xk"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.420452 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-zq6xk" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.420765 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-glp9q"] Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.433146 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.452973 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.472529 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.488896 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25f557a3-e5cd-4355-9e52-b7542c0103d2-metrics-tls\") pod \"dns-operator-744455d44c-rshgn\" (UID: \"25f557a3-e5cd-4355-9e52-b7542c0103d2\") " pod="openshift-dns-operator/dns-operator-744455d44c-rshgn" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.489055 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b028b276-09cb-4d47-af70-1790128259df-serving-cert\") pod \"service-ca-operator-777779d784-9b6t7\" (UID: \"b028b276-09cb-4d47-af70-1790128259df\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9b6t7" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.489175 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48fdef5e-c65c-4898-af52-6ea141ab67b7-config\") pod \"kube-apiserver-operator-766d6c64bb-ppl5f\" (UID: \"48fdef5e-c65c-4898-af52-6ea141ab67b7\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ppl5f" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.489292 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e13a5b1-f9f7-4045-952a-a44cfd536a99-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zmp5c\" (UID: \"4e13a5b1-f9f7-4045-952a-a44cfd536a99\") " pod="openshift-marketplace/marketplace-operator-79b997595-zmp5c" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.489423 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/78e9b53f-4e52-4b94-8685-d5e84fb0b9cd-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-92skl\" (UID: \"78e9b53f-4e52-4b94-8685-d5e84fb0b9cd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-92skl" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.489537 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/20670dac-a915-49d2-8953-cb842980ca87-etcd-client\") pod \"apiserver-7bbb656c7d-4zwrj\" (UID: \"20670dac-a915-49d2-8953-cb842980ca87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.489684 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-config\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.489831 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/fed196e5-1e64-4d16-b63f-297eac90a06d-service-ca-bundle\") pod \"router-default-5444994796-r5h7f\" (UID: \"fed196e5-1e64-4d16-b63f-297eac90a06d\") " pod="openshift-ingress/router-default-5444994796-r5h7f" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.489922 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78e9b53f-4e52-4b94-8685-d5e84fb0b9cd-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-92skl\" (UID: \"78e9b53f-4e52-4b94-8685-d5e84fb0b9cd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-92skl" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.490014 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f63d03a-f0c4-486c-9e6e-eb29c84d228b-serving-cert\") pod \"etcd-operator-b45778765-nsrtq\" (UID: \"3f63d03a-f0c4-486c-9e6e-eb29c84d228b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nsrtq" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.490112 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f15c383-9eb5-4942-a63d-48e54beea23d-config-volume\") pod \"collect-profiles-29399580-sccd5\" (UID: \"6f15c383-9eb5-4942-a63d-48e54beea23d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399580-sccd5" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.490236 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twccj\" (UniqueName: \"kubernetes.io/projected/7aeea6e8-f475-47dc-8b80-fade6640c678-kube-api-access-twccj\") pod \"machine-config-operator-74547568cd-kj6k4\" (UID: \"7aeea6e8-f475-47dc-8b80-fade6640c678\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kj6k4" Nov 24 09:06:05 crc 
kubenswrapper[4563]: I1124 09:06:05.490296 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-config\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.490373 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fed196e5-1e64-4d16-b63f-297eac90a06d-default-certificate\") pod \"router-default-5444994796-r5h7f\" (UID: \"fed196e5-1e64-4d16-b63f-297eac90a06d\") " pod="openshift-ingress/router-default-5444994796-r5h7f" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.490461 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/898668ee-3043-4bdc-8e77-82c108bcc65d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zgnq6\" (UID: \"898668ee-3043-4bdc-8e77-82c108bcc65d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zgnq6" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.490591 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-console-serving-cert\") pod \"console-f9d7485db-7hx7w\" (UID: \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\") " pod="openshift-console/console-f9d7485db-7hx7w" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.490740 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxlvj\" (UniqueName: \"kubernetes.io/projected/8d287fb4-5d89-41ef-953d-92afcb5f33d3-kube-api-access-jxlvj\") pod \"olm-operator-6b444d44fb-5mx8n\" (UID: 
\"8d287fb4-5d89-41ef-953d-92afcb5f33d3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5mx8n" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.490858 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-console-config\") pod \"console-f9d7485db-7hx7w\" (UID: \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\") " pod="openshift-console/console-f9d7485db-7hx7w" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.490951 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/28e525e8-966c-4b8a-b9bb-064bfb18b592-srv-cert\") pod \"catalog-operator-68c6474976-jbgzc\" (UID: \"28e525e8-966c-4b8a-b9bb-064bfb18b592\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jbgzc" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.491013 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/78e9b53f-4e52-4b94-8685-d5e84fb0b9cd-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-92skl\" (UID: \"78e9b53f-4e52-4b94-8685-d5e84fb0b9cd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-92skl" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.491156 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-oauth-serving-cert\") pod \"console-f9d7485db-7hx7w\" (UID: \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\") " pod="openshift-console/console-f9d7485db-7hx7w" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.491238 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/898668ee-3043-4bdc-8e77-82c108bcc65d-config\") pod \"kube-controller-manager-operator-78b949d7b-zgnq6\" (UID: \"898668ee-3043-4bdc-8e77-82c108bcc65d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zgnq6" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.491304 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07ac5545-11d9-470e-a4e0-1073333eebdc-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wwmlv\" (UID: \"07ac5545-11d9-470e-a4e0-1073333eebdc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wwmlv" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.491375 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74b64da0-d6d7-44d6-9be9-d1120a019e02-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-54kkk\" (UID: \"74b64da0-d6d7-44d6-9be9-d1120a019e02\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54kkk" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.491443 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/20670dac-a915-49d2-8953-cb842980ca87-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4zwrj\" (UID: \"20670dac-a915-49d2-8953-cb842980ca87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.491511 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07ac5545-11d9-470e-a4e0-1073333eebdc-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wwmlv\" (UID: \"07ac5545-11d9-470e-a4e0-1073333eebdc\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wwmlv" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.491597 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbjxc\" (UniqueName: \"kubernetes.io/projected/28e525e8-966c-4b8a-b9bb-064bfb18b592-kube-api-access-zbjxc\") pod \"catalog-operator-68c6474976-jbgzc\" (UID: \"28e525e8-966c-4b8a-b9bb-064bfb18b592\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jbgzc" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.491689 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-audit\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.491796 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-serving-cert\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.491889 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-audit-dir\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.491956 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6ca055d5-b576-4aa7-bcb2-138156414ff0-package-server-manager-serving-cert\") pod 
\"package-server-manager-789f6589d5-xjp4v\" (UID: \"6ca055d5-b576-4aa7-bcb2-138156414ff0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xjp4v" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.492047 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/378c7d30-dd7c-4aa5-83cf-7caca587f283-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-z4lb4\" (UID: \"378c7d30-dd7c-4aa5-83cf-7caca587f283\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-z4lb4" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.492122 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/48157749-8872-4c5b-b119-efe27cfd887e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bxdlb\" (UID: \"48157749-8872-4c5b-b119-efe27cfd887e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bxdlb" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.492176 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/20670dac-a915-49d2-8953-cb842980ca87-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4zwrj\" (UID: \"20670dac-a915-49d2-8953-cb842980ca87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.492049 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-audit-dir\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.492274 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3f63d03a-f0c4-486c-9e6e-eb29c84d228b-etcd-ca\") pod \"etcd-operator-b45778765-nsrtq\" (UID: \"3f63d03a-f0c4-486c-9e6e-eb29c84d228b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nsrtq" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.492403 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8d287fb4-5d89-41ef-953d-92afcb5f33d3-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5mx8n\" (UID: \"8d287fb4-5d89-41ef-953d-92afcb5f33d3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5mx8n" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.492495 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-trusted-ca-bundle\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.492646 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3f63d03a-f0c4-486c-9e6e-eb29c84d228b-etcd-client\") pod \"etcd-operator-b45778765-nsrtq\" (UID: \"3f63d03a-f0c4-486c-9e6e-eb29c84d228b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nsrtq" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.492749 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a1768f84-8c65-48c1-beb8-1309d1d2e823-auth-proxy-config\") pod \"machine-approver-56656f9798-q4vst\" (UID: \"a1768f84-8c65-48c1-beb8-1309d1d2e823\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q4vst" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.492844 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d9a97b3-2f1d-4159-b128-a8e34f58f55b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6l6vs\" (UID: \"3d9a97b3-2f1d-4159-b128-a8e34f58f55b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6l6vs" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.492936 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.492524 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-audit\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.492964 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fed196e5-1e64-4d16-b63f-297eac90a06d-stats-auth\") pod \"router-default-5444994796-r5h7f\" (UID: \"fed196e5-1e64-4d16-b63f-297eac90a06d\") " pod="openshift-ingress/router-default-5444994796-r5h7f" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.493127 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7aeea6e8-f475-47dc-8b80-fade6640c678-images\") pod \"machine-config-operator-74547568cd-kj6k4\" (UID: \"7aeea6e8-f475-47dc-8b80-fade6640c678\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kj6k4" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 
09:06:05.493228 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48fdef5e-c65c-4898-af52-6ea141ab67b7-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-ppl5f\" (UID: \"48fdef5e-c65c-4898-af52-6ea141ab67b7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ppl5f" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.493292 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e584f76f-d222-42b7-bfad-8190793ade5c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jqkcz\" (UID: \"e584f76f-d222-42b7-bfad-8190793ade5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jqkcz" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.493370 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-275wc\" (UniqueName: \"kubernetes.io/projected/e11ba0a3-483c-4306-a89f-61f79a52b10d-kube-api-access-275wc\") pod \"multus-admission-controller-857f4d67dd-jj9mw\" (UID: \"e11ba0a3-483c-4306-a89f-61f79a52b10d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jj9mw" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.493337 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-trusted-ca-bundle\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.493438 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-trusted-ca-bundle\") pod 
\"console-f9d7485db-7hx7w\" (UID: \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\") " pod="openshift-console/console-f9d7485db-7hx7w" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.493552 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48157749-8872-4c5b-b119-efe27cfd887e-config\") pod \"machine-api-operator-5694c8668f-bxdlb\" (UID: \"48157749-8872-4c5b-b119-efe27cfd887e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bxdlb" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.493631 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/898668ee-3043-4bdc-8e77-82c108bcc65d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zgnq6\" (UID: \"898668ee-3043-4bdc-8e77-82c108bcc65d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zgnq6" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.493667 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a1768f84-8c65-48c1-beb8-1309d1d2e823-auth-proxy-config\") pod \"machine-approver-56656f9798-q4vst\" (UID: \"a1768f84-8c65-48c1-beb8-1309d1d2e823\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q4vst" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.493700 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzr8v\" (UniqueName: \"kubernetes.io/projected/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-kube-api-access-fzr8v\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.493769 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/6f15c383-9eb5-4942-a63d-48e54beea23d-secret-volume\") pod \"collect-profiles-29399580-sccd5\" (UID: \"6f15c383-9eb5-4942-a63d-48e54beea23d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399580-sccd5" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.493832 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs54w\" (UniqueName: \"kubernetes.io/projected/48157749-8872-4c5b-b119-efe27cfd887e-kube-api-access-xs54w\") pod \"machine-api-operator-5694c8668f-bxdlb\" (UID: \"48157749-8872-4c5b-b119-efe27cfd887e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bxdlb" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.493865 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-encryption-config\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.493909 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a1768f84-8c65-48c1-beb8-1309d1d2e823-machine-approver-tls\") pod \"machine-approver-56656f9798-q4vst\" (UID: \"a1768f84-8c65-48c1-beb8-1309d1d2e823\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q4vst" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.493946 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9ph4\" (UniqueName: \"kubernetes.io/projected/3f63d03a-f0c4-486c-9e6e-eb29c84d228b-kube-api-access-l9ph4\") pod \"etcd-operator-b45778765-nsrtq\" (UID: \"3f63d03a-f0c4-486c-9e6e-eb29c84d228b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nsrtq" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 
09:06:05.493973 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xnff\" (UniqueName: \"kubernetes.io/projected/b5a091e0-6549-4f21-a0dc-5f7452dc9c0f-kube-api-access-4xnff\") pod \"downloads-7954f5f757-tzmwx\" (UID: \"b5a091e0-6549-4f21-a0dc-5f7452dc9c0f\") " pod="openshift-console/downloads-7954f5f757-tzmwx" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.494030 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f63d03a-f0c4-486c-9e6e-eb29c84d228b-config\") pod \"etcd-operator-b45778765-nsrtq\" (UID: \"3f63d03a-f0c4-486c-9e6e-eb29c84d228b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nsrtq" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.494064 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e584f76f-d222-42b7-bfad-8190793ade5c-config\") pod \"authentication-operator-69f744f599-jqkcz\" (UID: \"e584f76f-d222-42b7-bfad-8190793ade5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jqkcz" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.494090 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxrqz\" (UniqueName: \"kubernetes.io/projected/2eb44334-ab46-4eca-a39f-7b289792b178-kube-api-access-pxrqz\") pod \"packageserver-d55dfcdfc-jv526\" (UID: \"2eb44334-ab46-4eca-a39f-7b289792b178\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jv526" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.494114 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/20670dac-a915-49d2-8953-cb842980ca87-encryption-config\") pod \"apiserver-7bbb656c7d-4zwrj\" (UID: \"20670dac-a915-49d2-8953-cb842980ca87\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.494140 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-image-import-ca\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.494163 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20670dac-a915-49d2-8953-cb842980ca87-serving-cert\") pod \"apiserver-7bbb656c7d-4zwrj\" (UID: \"20670dac-a915-49d2-8953-cb842980ca87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.494252 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-console-oauth-config\") pod \"console-f9d7485db-7hx7w\" (UID: \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\") " pod="openshift-console/console-f9d7485db-7hx7w" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.494297 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-etcd-serving-ca\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.494356 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4dpq\" (UniqueName: \"kubernetes.io/projected/74b64da0-d6d7-44d6-9be9-d1120a019e02-kube-api-access-q4dpq\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-54kkk\" (UID: \"74b64da0-d6d7-44d6-9be9-d1120a019e02\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54kkk" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.494409 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fed196e5-1e64-4d16-b63f-297eac90a06d-metrics-certs\") pod \"router-default-5444994796-r5h7f\" (UID: \"fed196e5-1e64-4d16-b63f-297eac90a06d\") " pod="openshift-ingress/router-default-5444994796-r5h7f" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.494451 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xmpd\" (UniqueName: \"kubernetes.io/projected/b028b276-09cb-4d47-af70-1790128259df-kube-api-access-6xmpd\") pod \"service-ca-operator-777779d784-9b6t7\" (UID: \"b028b276-09cb-4d47-af70-1790128259df\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9b6t7" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.494496 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h967v\" (UniqueName: \"kubernetes.io/projected/378c7d30-dd7c-4aa5-83cf-7caca587f283-kube-api-access-h967v\") pod \"control-plane-machine-set-operator-78cbb6b69f-z4lb4\" (UID: \"378c7d30-dd7c-4aa5-83cf-7caca587f283\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-z4lb4" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.494530 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh8l6\" (UniqueName: \"kubernetes.io/projected/6f15c383-9eb5-4942-a63d-48e54beea23d-kube-api-access-rh8l6\") pod \"collect-profiles-29399580-sccd5\" (UID: \"6f15c383-9eb5-4942-a63d-48e54beea23d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399580-sccd5" Nov 24 09:06:05 crc 
kubenswrapper[4563]: I1124 09:06:05.494594 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9350d335-c1d0-4315-9da0-75a6bd635efc-serving-cert\") pod \"openshift-config-operator-7777fb866f-jmv44\" (UID: \"9350d335-c1d0-4315-9da0-75a6bd635efc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmv44" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.494676 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8d287fb4-5d89-41ef-953d-92afcb5f33d3-srv-cert\") pod \"olm-operator-6b444d44fb-5mx8n\" (UID: \"8d287fb4-5d89-41ef-953d-92afcb5f33d3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5mx8n" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.494731 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhslv\" (UniqueName: \"kubernetes.io/projected/e584f76f-d222-42b7-bfad-8190793ade5c-kube-api-access-jhslv\") pod \"authentication-operator-69f744f599-jqkcz\" (UID: \"e584f76f-d222-42b7-bfad-8190793ade5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jqkcz" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.494848 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/48157749-8872-4c5b-b119-efe27cfd887e-images\") pod \"machine-api-operator-5694c8668f-bxdlb\" (UID: \"48157749-8872-4c5b-b119-efe27cfd887e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bxdlb" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.494899 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hrrj\" (UniqueName: 
\"kubernetes.io/projected/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-kube-api-access-5hrrj\") pod \"console-f9d7485db-7hx7w\" (UID: \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\") " pod="openshift-console/console-f9d7485db-7hx7w" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.494941 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/28e525e8-966c-4b8a-b9bb-064bfb18b592-profile-collector-cert\") pod \"catalog-operator-68c6474976-jbgzc\" (UID: \"28e525e8-966c-4b8a-b9bb-064bfb18b592\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jbgzc" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.494976 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-service-ca\") pod \"console-f9d7485db-7hx7w\" (UID: \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\") " pod="openshift-console/console-f9d7485db-7hx7w" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.495046 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-etcd-client\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.495098 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2eb44334-ab46-4eca-a39f-7b289792b178-webhook-cert\") pod \"packageserver-d55dfcdfc-jv526\" (UID: \"2eb44334-ab46-4eca-a39f-7b289792b178\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jv526" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.495159 4563 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-ffvcw\" (UniqueName: \"kubernetes.io/projected/4e13a5b1-f9f7-4045-952a-a44cfd536a99-kube-api-access-ffvcw\") pod \"marketplace-operator-79b997595-zmp5c\" (UID: \"4e13a5b1-f9f7-4045-952a-a44cfd536a99\") " pod="openshift-marketplace/marketplace-operator-79b997595-zmp5c" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.495205 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdrch\" (UniqueName: \"kubernetes.io/projected/dcb29296-6b35-478f-9712-ef96d33867c2-kube-api-access-bdrch\") pod \"migrator-59844c95c7-vsmrn\" (UID: \"dcb29296-6b35-478f-9712-ef96d33867c2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vsmrn" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.495310 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-etcd-serving-ca\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.494490 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48157749-8872-4c5b-b119-efe27cfd887e-config\") pod \"machine-api-operator-5694c8668f-bxdlb\" (UID: \"48157749-8872-4c5b-b119-efe27cfd887e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bxdlb" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.495357 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-image-import-ca\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.495474 4563 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/48157749-8872-4c5b-b119-efe27cfd887e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bxdlb\" (UID: \"48157749-8872-4c5b-b119-efe27cfd887e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bxdlb" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.495491 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b028b276-09cb-4d47-af70-1790128259df-config\") pod \"service-ca-operator-777779d784-9b6t7\" (UID: \"b028b276-09cb-4d47-af70-1790128259df\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9b6t7" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.495557 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/20670dac-a915-49d2-8953-cb842980ca87-audit-dir\") pod \"apiserver-7bbb656c7d-4zwrj\" (UID: \"20670dac-a915-49d2-8953-cb842980ca87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.495614 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/20670dac-a915-49d2-8953-cb842980ca87-audit-dir\") pod \"apiserver-7bbb656c7d-4zwrj\" (UID: \"20670dac-a915-49d2-8953-cb842980ca87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.495623 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-node-pullsecrets\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc 
kubenswrapper[4563]: I1124 09:06:05.495681 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e584f76f-d222-42b7-bfad-8190793ade5c-serving-cert\") pod \"authentication-operator-69f744f599-jqkcz\" (UID: \"e584f76f-d222-42b7-bfad-8190793ade5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jqkcz" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.495722 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e584f76f-d222-42b7-bfad-8190793ade5c-service-ca-bundle\") pod \"authentication-operator-69f744f599-jqkcz\" (UID: \"e584f76f-d222-42b7-bfad-8190793ade5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jqkcz" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.495759 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2eb44334-ab46-4eca-a39f-7b289792b178-apiservice-cert\") pod \"packageserver-d55dfcdfc-jv526\" (UID: \"2eb44334-ab46-4eca-a39f-7b289792b178\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jv526" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.495783 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20670dac-a915-49d2-8953-cb842980ca87-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4zwrj\" (UID: \"20670dac-a915-49d2-8953-cb842980ca87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.495805 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7aeea6e8-f475-47dc-8b80-fade6640c678-proxy-tls\") pod 
\"machine-config-operator-74547568cd-kj6k4\" (UID: \"7aeea6e8-f475-47dc-8b80-fade6640c678\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kj6k4" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.495824 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e11ba0a3-483c-4306-a89f-61f79a52b10d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-jj9mw\" (UID: \"e11ba0a3-483c-4306-a89f-61f79a52b10d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jj9mw" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.495843 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48fdef5e-c65c-4898-af52-6ea141ab67b7-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-ppl5f\" (UID: \"48fdef5e-c65c-4898-af52-6ea141ab67b7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ppl5f" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.495855 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/48157749-8872-4c5b-b119-efe27cfd887e-images\") pod \"machine-api-operator-5694c8668f-bxdlb\" (UID: \"48157749-8872-4c5b-b119-efe27cfd887e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bxdlb" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.495866 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppqhd\" (UniqueName: \"kubernetes.io/projected/9350d335-c1d0-4315-9da0-75a6bd635efc-kube-api-access-ppqhd\") pod \"openshift-config-operator-7777fb866f-jmv44\" (UID: \"9350d335-c1d0-4315-9da0-75a6bd635efc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmv44" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.495955 4563 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwp4h\" (UniqueName: \"kubernetes.io/projected/78e9b53f-4e52-4b94-8685-d5e84fb0b9cd-kube-api-access-lwp4h\") pod \"cluster-image-registry-operator-dc59b4c8b-92skl\" (UID: \"78e9b53f-4e52-4b94-8685-d5e84fb0b9cd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-92skl" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.495968 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-node-pullsecrets\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.496005 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqzqs\" (UniqueName: \"kubernetes.io/projected/20670dac-a915-49d2-8953-cb842980ca87-kube-api-access-fqzqs\") pod \"apiserver-7bbb656c7d-4zwrj\" (UID: \"20670dac-a915-49d2-8953-cb842980ca87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.496121 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20670dac-a915-49d2-8953-cb842980ca87-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4zwrj\" (UID: \"20670dac-a915-49d2-8953-cb842980ca87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.496210 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4e13a5b1-f9f7-4045-952a-a44cfd536a99-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zmp5c\" (UID: \"4e13a5b1-f9f7-4045-952a-a44cfd536a99\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-zmp5c" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.496241 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb7bc\" (UniqueName: \"kubernetes.io/projected/3d9a97b3-2f1d-4159-b128-a8e34f58f55b-kube-api-access-zb7bc\") pod \"kube-storage-version-migrator-operator-b67b599dd-6l6vs\" (UID: \"3d9a97b3-2f1d-4159-b128-a8e34f58f55b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6l6vs" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.496263 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07ac5545-11d9-470e-a4e0-1073333eebdc-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wwmlv\" (UID: \"07ac5545-11d9-470e-a4e0-1073333eebdc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wwmlv" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.496280 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56kvf\" (UniqueName: \"kubernetes.io/projected/25f557a3-e5cd-4355-9e52-b7542c0103d2-kube-api-access-56kvf\") pod \"dns-operator-744455d44c-rshgn\" (UID: \"25f557a3-e5cd-4355-9e52-b7542c0103d2\") " pod="openshift-dns-operator/dns-operator-744455d44c-rshgn" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.496298 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d9a97b3-2f1d-4159-b128-a8e34f58f55b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6l6vs\" (UID: \"3d9a97b3-2f1d-4159-b128-a8e34f58f55b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6l6vs" Nov 24 09:06:05 crc kubenswrapper[4563]: 
I1124 09:06:05.496317 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1768f84-8c65-48c1-beb8-1309d1d2e823-config\") pod \"machine-approver-56656f9798-q4vst\" (UID: \"a1768f84-8c65-48c1-beb8-1309d1d2e823\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q4vst" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.496345 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzzfd\" (UniqueName: \"kubernetes.io/projected/6ca055d5-b576-4aa7-bcb2-138156414ff0-kube-api-access-gzzfd\") pod \"package-server-manager-789f6589d5-xjp4v\" (UID: \"6ca055d5-b576-4aa7-bcb2-138156414ff0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xjp4v" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.496377 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3f63d03a-f0c4-486c-9e6e-eb29c84d228b-etcd-service-ca\") pod \"etcd-operator-b45778765-nsrtq\" (UID: \"3f63d03a-f0c4-486c-9e6e-eb29c84d228b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nsrtq" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.496416 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7aeea6e8-f475-47dc-8b80-fade6640c678-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kj6k4\" (UID: \"7aeea6e8-f475-47dc-8b80-fade6640c678\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kj6k4" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.496653 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgc69\" (UniqueName: \"kubernetes.io/projected/a1768f84-8c65-48c1-beb8-1309d1d2e823-kube-api-access-lgc69\") pod 
\"machine-approver-56656f9798-q4vst\" (UID: \"a1768f84-8c65-48c1-beb8-1309d1d2e823\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q4vst" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.496702 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2eb44334-ab46-4eca-a39f-7b289792b178-tmpfs\") pod \"packageserver-d55dfcdfc-jv526\" (UID: \"2eb44334-ab46-4eca-a39f-7b289792b178\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jv526" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.497545 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/20670dac-a915-49d2-8953-cb842980ca87-audit-policies\") pod \"apiserver-7bbb656c7d-4zwrj\" (UID: \"20670dac-a915-49d2-8953-cb842980ca87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.497566 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfzs9\" (UniqueName: \"kubernetes.io/projected/fed196e5-1e64-4d16-b63f-297eac90a06d-kube-api-access-kfzs9\") pod \"router-default-5444994796-r5h7f\" (UID: \"fed196e5-1e64-4d16-b63f-297eac90a06d\") " pod="openshift-ingress/router-default-5444994796-r5h7f" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.497843 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/20670dac-a915-49d2-8953-cb842980ca87-etcd-client\") pod \"apiserver-7bbb656c7d-4zwrj\" (UID: \"20670dac-a915-49d2-8953-cb842980ca87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.497393 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/20670dac-a915-49d2-8953-cb842980ca87-serving-cert\") pod \"apiserver-7bbb656c7d-4zwrj\" (UID: \"20670dac-a915-49d2-8953-cb842980ca87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.497489 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7aeea6e8-f475-47dc-8b80-fade6640c678-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kj6k4\" (UID: \"7aeea6e8-f475-47dc-8b80-fade6640c678\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kj6k4" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.496823 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-serving-cert\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.497167 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1768f84-8c65-48c1-beb8-1309d1d2e823-config\") pod \"machine-approver-56656f9798-q4vst\" (UID: \"a1768f84-8c65-48c1-beb8-1309d1d2e823\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q4vst" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.497322 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2eb44334-ab46-4eca-a39f-7b289792b178-tmpfs\") pod \"packageserver-d55dfcdfc-jv526\" (UID: \"2eb44334-ab46-4eca-a39f-7b289792b178\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jv526" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.498171 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/20670dac-a915-49d2-8953-cb842980ca87-audit-policies\") pod \"apiserver-7bbb656c7d-4zwrj\" (UID: \"20670dac-a915-49d2-8953-cb842980ca87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.498452 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/20670dac-a915-49d2-8953-cb842980ca87-encryption-config\") pod \"apiserver-7bbb656c7d-4zwrj\" (UID: \"20670dac-a915-49d2-8953-cb842980ca87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.499474 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-encryption-config\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.499741 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-etcd-client\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.510180 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9350d335-c1d0-4315-9da0-75a6bd635efc-available-featuregates\") pod \"openshift-config-operator-7777fb866f-jmv44\" (UID: \"9350d335-c1d0-4315-9da0-75a6bd635efc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmv44" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.510496 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74b64da0-d6d7-44d6-9be9-d1120a019e02-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-54kkk\" (UID: \"74b64da0-d6d7-44d6-9be9-d1120a019e02\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54kkk" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.511913 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a1768f84-8c65-48c1-beb8-1309d1d2e823-machine-approver-tls\") pod \"machine-approver-56656f9798-q4vst\" (UID: \"a1768f84-8c65-48c1-beb8-1309d1d2e823\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q4vst" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.513808 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.553077 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.572845 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.592799 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611187 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xmpd\" (UniqueName: \"kubernetes.io/projected/b028b276-09cb-4d47-af70-1790128259df-kube-api-access-6xmpd\") pod \"service-ca-operator-777779d784-9b6t7\" (UID: \"b028b276-09cb-4d47-af70-1790128259df\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-9b6t7" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611220 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-console-oauth-config\") pod \"console-f9d7485db-7hx7w\" (UID: \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\") " pod="openshift-console/console-f9d7485db-7hx7w" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611240 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4dpq\" (UniqueName: \"kubernetes.io/projected/74b64da0-d6d7-44d6-9be9-d1120a019e02-kube-api-access-q4dpq\") pod \"openshift-controller-manager-operator-756b6f6bc6-54kkk\" (UID: \"74b64da0-d6d7-44d6-9be9-d1120a019e02\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54kkk" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611260 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fed196e5-1e64-4d16-b63f-297eac90a06d-metrics-certs\") pod \"router-default-5444994796-r5h7f\" (UID: \"fed196e5-1e64-4d16-b63f-297eac90a06d\") " pod="openshift-ingress/router-default-5444994796-r5h7f" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611291 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9350d335-c1d0-4315-9da0-75a6bd635efc-serving-cert\") pod \"openshift-config-operator-7777fb866f-jmv44\" (UID: \"9350d335-c1d0-4315-9da0-75a6bd635efc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmv44" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611311 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/8d287fb4-5d89-41ef-953d-92afcb5f33d3-srv-cert\") pod \"olm-operator-6b444d44fb-5mx8n\" (UID: \"8d287fb4-5d89-41ef-953d-92afcb5f33d3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5mx8n" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611328 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhslv\" (UniqueName: \"kubernetes.io/projected/e584f76f-d222-42b7-bfad-8190793ade5c-kube-api-access-jhslv\") pod \"authentication-operator-69f744f599-jqkcz\" (UID: \"e584f76f-d222-42b7-bfad-8190793ade5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jqkcz" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611360 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hrrj\" (UniqueName: \"kubernetes.io/projected/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-kube-api-access-5hrrj\") pod \"console-f9d7485db-7hx7w\" (UID: \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\") " pod="openshift-console/console-f9d7485db-7hx7w" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611389 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/28e525e8-966c-4b8a-b9bb-064bfb18b592-profile-collector-cert\") pod \"catalog-operator-68c6474976-jbgzc\" (UID: \"28e525e8-966c-4b8a-b9bb-064bfb18b592\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jbgzc" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611405 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-service-ca\") pod \"console-f9d7485db-7hx7w\" (UID: \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\") " pod="openshift-console/console-f9d7485db-7hx7w" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611431 4563 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b028b276-09cb-4d47-af70-1790128259df-config\") pod \"service-ca-operator-777779d784-9b6t7\" (UID: \"b028b276-09cb-4d47-af70-1790128259df\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9b6t7" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611455 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e584f76f-d222-42b7-bfad-8190793ade5c-service-ca-bundle\") pod \"authentication-operator-69f744f599-jqkcz\" (UID: \"e584f76f-d222-42b7-bfad-8190793ade5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jqkcz" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611470 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e584f76f-d222-42b7-bfad-8190793ade5c-serving-cert\") pod \"authentication-operator-69f744f599-jqkcz\" (UID: \"e584f76f-d222-42b7-bfad-8190793ade5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jqkcz" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611490 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e11ba0a3-483c-4306-a89f-61f79a52b10d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-jj9mw\" (UID: \"e11ba0a3-483c-4306-a89f-61f79a52b10d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jj9mw" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611517 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppqhd\" (UniqueName: \"kubernetes.io/projected/9350d335-c1d0-4315-9da0-75a6bd635efc-kube-api-access-ppqhd\") pod \"openshift-config-operator-7777fb866f-jmv44\" (UID: \"9350d335-c1d0-4315-9da0-75a6bd635efc\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmv44" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611534 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwp4h\" (UniqueName: \"kubernetes.io/projected/78e9b53f-4e52-4b94-8685-d5e84fb0b9cd-kube-api-access-lwp4h\") pod \"cluster-image-registry-operator-dc59b4c8b-92skl\" (UID: \"78e9b53f-4e52-4b94-8685-d5e84fb0b9cd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-92skl" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611564 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb7bc\" (UniqueName: \"kubernetes.io/projected/3d9a97b3-2f1d-4159-b128-a8e34f58f55b-kube-api-access-zb7bc\") pod \"kube-storage-version-migrator-operator-b67b599dd-6l6vs\" (UID: \"3d9a97b3-2f1d-4159-b128-a8e34f58f55b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6l6vs" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611601 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56kvf\" (UniqueName: \"kubernetes.io/projected/25f557a3-e5cd-4355-9e52-b7542c0103d2-kube-api-access-56kvf\") pod \"dns-operator-744455d44c-rshgn\" (UID: \"25f557a3-e5cd-4355-9e52-b7542c0103d2\") " pod="openshift-dns-operator/dns-operator-744455d44c-rshgn" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611626 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d9a97b3-2f1d-4159-b128-a8e34f58f55b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6l6vs\" (UID: \"3d9a97b3-2f1d-4159-b128-a8e34f58f55b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6l6vs" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611672 4563 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3f63d03a-f0c4-486c-9e6e-eb29c84d228b-etcd-service-ca\") pod \"etcd-operator-b45778765-nsrtq\" (UID: \"3f63d03a-f0c4-486c-9e6e-eb29c84d228b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nsrtq" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611706 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfzs9\" (UniqueName: \"kubernetes.io/projected/fed196e5-1e64-4d16-b63f-297eac90a06d-kube-api-access-kfzs9\") pod \"router-default-5444994796-r5h7f\" (UID: \"fed196e5-1e64-4d16-b63f-297eac90a06d\") " pod="openshift-ingress/router-default-5444994796-r5h7f" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611726 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74b64da0-d6d7-44d6-9be9-d1120a019e02-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-54kkk\" (UID: \"74b64da0-d6d7-44d6-9be9-d1120a019e02\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54kkk" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611743 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9350d335-c1d0-4315-9da0-75a6bd635efc-available-featuregates\") pod \"openshift-config-operator-7777fb866f-jmv44\" (UID: \"9350d335-c1d0-4315-9da0-75a6bd635efc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmv44" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611759 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25f557a3-e5cd-4355-9e52-b7542c0103d2-metrics-tls\") pod \"dns-operator-744455d44c-rshgn\" (UID: \"25f557a3-e5cd-4355-9e52-b7542c0103d2\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-rshgn" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611776 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b028b276-09cb-4d47-af70-1790128259df-serving-cert\") pod \"service-ca-operator-777779d784-9b6t7\" (UID: \"b028b276-09cb-4d47-af70-1790128259df\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9b6t7" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611798 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/78e9b53f-4e52-4b94-8685-d5e84fb0b9cd-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-92skl\" (UID: \"78e9b53f-4e52-4b94-8685-d5e84fb0b9cd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-92skl" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611846 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fed196e5-1e64-4d16-b63f-297eac90a06d-service-ca-bundle\") pod \"router-default-5444994796-r5h7f\" (UID: \"fed196e5-1e64-4d16-b63f-297eac90a06d\") " pod="openshift-ingress/router-default-5444994796-r5h7f" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611864 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78e9b53f-4e52-4b94-8685-d5e84fb0b9cd-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-92skl\" (UID: \"78e9b53f-4e52-4b94-8685-d5e84fb0b9cd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-92skl" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611890 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3f63d03a-f0c4-486c-9e6e-eb29c84d228b-serving-cert\") pod \"etcd-operator-b45778765-nsrtq\" (UID: \"3f63d03a-f0c4-486c-9e6e-eb29c84d228b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nsrtq" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611926 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fed196e5-1e64-4d16-b63f-297eac90a06d-default-certificate\") pod \"router-default-5444994796-r5h7f\" (UID: \"fed196e5-1e64-4d16-b63f-297eac90a06d\") " pod="openshift-ingress/router-default-5444994796-r5h7f" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611950 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-console-serving-cert\") pod \"console-f9d7485db-7hx7w\" (UID: \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\") " pod="openshift-console/console-f9d7485db-7hx7w" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611968 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxlvj\" (UniqueName: \"kubernetes.io/projected/8d287fb4-5d89-41ef-953d-92afcb5f33d3-kube-api-access-jxlvj\") pod \"olm-operator-6b444d44fb-5mx8n\" (UID: \"8d287fb4-5d89-41ef-953d-92afcb5f33d3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5mx8n" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.611986 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-console-config\") pod \"console-f9d7485db-7hx7w\" (UID: \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\") " pod="openshift-console/console-f9d7485db-7hx7w" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.612001 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"srv-cert\" (UniqueName: \"kubernetes.io/secret/28e525e8-966c-4b8a-b9bb-064bfb18b592-srv-cert\") pod \"catalog-operator-68c6474976-jbgzc\" (UID: \"28e525e8-966c-4b8a-b9bb-064bfb18b592\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jbgzc" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.612017 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/78e9b53f-4e52-4b94-8685-d5e84fb0b9cd-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-92skl\" (UID: \"78e9b53f-4e52-4b94-8685-d5e84fb0b9cd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-92skl" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.612034 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-oauth-serving-cert\") pod \"console-f9d7485db-7hx7w\" (UID: \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\") " pod="openshift-console/console-f9d7485db-7hx7w" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.612061 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74b64da0-d6d7-44d6-9be9-d1120a019e02-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-54kkk\" (UID: \"74b64da0-d6d7-44d6-9be9-d1120a019e02\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54kkk" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.612083 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbjxc\" (UniqueName: \"kubernetes.io/projected/28e525e8-966c-4b8a-b9bb-064bfb18b592-kube-api-access-zbjxc\") pod \"catalog-operator-68c6474976-jbgzc\" (UID: \"28e525e8-966c-4b8a-b9bb-064bfb18b592\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jbgzc" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.612112 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3f63d03a-f0c4-486c-9e6e-eb29c84d228b-etcd-ca\") pod \"etcd-operator-b45778765-nsrtq\" (UID: \"3f63d03a-f0c4-486c-9e6e-eb29c84d228b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nsrtq" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.612135 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8d287fb4-5d89-41ef-953d-92afcb5f33d3-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5mx8n\" (UID: \"8d287fb4-5d89-41ef-953d-92afcb5f33d3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5mx8n" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.612149 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3f63d03a-f0c4-486c-9e6e-eb29c84d228b-etcd-client\") pod \"etcd-operator-b45778765-nsrtq\" (UID: \"3f63d03a-f0c4-486c-9e6e-eb29c84d228b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nsrtq" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.612167 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d9a97b3-2f1d-4159-b128-a8e34f58f55b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6l6vs\" (UID: \"3d9a97b3-2f1d-4159-b128-a8e34f58f55b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6l6vs" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.612181 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/fed196e5-1e64-4d16-b63f-297eac90a06d-stats-auth\") pod \"router-default-5444994796-r5h7f\" (UID: \"fed196e5-1e64-4d16-b63f-297eac90a06d\") " pod="openshift-ingress/router-default-5444994796-r5h7f" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.612205 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e584f76f-d222-42b7-bfad-8190793ade5c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jqkcz\" (UID: \"e584f76f-d222-42b7-bfad-8190793ade5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jqkcz" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.612221 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-275wc\" (UniqueName: \"kubernetes.io/projected/e11ba0a3-483c-4306-a89f-61f79a52b10d-kube-api-access-275wc\") pod \"multus-admission-controller-857f4d67dd-jj9mw\" (UID: \"e11ba0a3-483c-4306-a89f-61f79a52b10d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jj9mw" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.612237 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-trusted-ca-bundle\") pod \"console-f9d7485db-7hx7w\" (UID: \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\") " pod="openshift-console/console-f9d7485db-7hx7w" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.612259 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-service-ca\") pod \"console-f9d7485db-7hx7w\" (UID: \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\") " pod="openshift-console/console-f9d7485db-7hx7w" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.612287 4563 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-l9ph4\" (UniqueName: \"kubernetes.io/projected/3f63d03a-f0c4-486c-9e6e-eb29c84d228b-kube-api-access-l9ph4\") pod \"etcd-operator-b45778765-nsrtq\" (UID: \"3f63d03a-f0c4-486c-9e6e-eb29c84d228b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nsrtq" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.612311 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f63d03a-f0c4-486c-9e6e-eb29c84d228b-config\") pod \"etcd-operator-b45778765-nsrtq\" (UID: \"3f63d03a-f0c4-486c-9e6e-eb29c84d228b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nsrtq" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.612326 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e584f76f-d222-42b7-bfad-8190793ade5c-config\") pod \"authentication-operator-69f744f599-jqkcz\" (UID: \"e584f76f-d222-42b7-bfad-8190793ade5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jqkcz" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.612495 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e584f76f-d222-42b7-bfad-8190793ade5c-service-ca-bundle\") pod \"authentication-operator-69f744f599-jqkcz\" (UID: \"e584f76f-d222-42b7-bfad-8190793ade5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jqkcz" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.612939 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9350d335-c1d0-4315-9da0-75a6bd635efc-available-featuregates\") pod \"openshift-config-operator-7777fb866f-jmv44\" (UID: \"9350d335-c1d0-4315-9da0-75a6bd635efc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmv44" Nov 24 09:06:05 crc 
kubenswrapper[4563]: I1124 09:06:05.613131 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-console-config\") pod \"console-f9d7485db-7hx7w\" (UID: \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\") " pod="openshift-console/console-f9d7485db-7hx7w" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.613528 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e584f76f-d222-42b7-bfad-8190793ade5c-config\") pod \"authentication-operator-69f744f599-jqkcz\" (UID: \"e584f76f-d222-42b7-bfad-8190793ade5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jqkcz" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.613797 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78e9b53f-4e52-4b94-8685-d5e84fb0b9cd-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-92skl\" (UID: \"78e9b53f-4e52-4b94-8685-d5e84fb0b9cd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-92skl" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.613880 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-oauth-serving-cert\") pod \"console-f9d7485db-7hx7w\" (UID: \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\") " pod="openshift-console/console-f9d7485db-7hx7w" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.613987 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74b64da0-d6d7-44d6-9be9-d1120a019e02-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-54kkk\" (UID: \"74b64da0-d6d7-44d6-9be9-d1120a019e02\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54kkk" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.614394 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.614399 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-trusted-ca-bundle\") pod \"console-f9d7485db-7hx7w\" (UID: \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\") " pod="openshift-console/console-f9d7485db-7hx7w" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.614511 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3f63d03a-f0c4-486c-9e6e-eb29c84d228b-etcd-ca\") pod \"etcd-operator-b45778765-nsrtq\" (UID: \"3f63d03a-f0c4-486c-9e6e-eb29c84d228b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nsrtq" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.615476 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74b64da0-d6d7-44d6-9be9-d1120a019e02-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-54kkk\" (UID: \"74b64da0-d6d7-44d6-9be9-d1120a019e02\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54kkk" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.615906 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25f557a3-e5cd-4355-9e52-b7542c0103d2-metrics-tls\") pod \"dns-operator-744455d44c-rshgn\" (UID: \"25f557a3-e5cd-4355-9e52-b7542c0103d2\") " pod="openshift-dns-operator/dns-operator-744455d44c-rshgn" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.615976 4563 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f63d03a-f0c4-486c-9e6e-eb29c84d228b-serving-cert\") pod \"etcd-operator-b45778765-nsrtq\" (UID: \"3f63d03a-f0c4-486c-9e6e-eb29c84d228b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nsrtq" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.616436 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e584f76f-d222-42b7-bfad-8190793ade5c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jqkcz\" (UID: \"e584f76f-d222-42b7-bfad-8190793ade5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jqkcz" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.616616 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-console-oauth-config\") pod \"console-f9d7485db-7hx7w\" (UID: \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\") " pod="openshift-console/console-f9d7485db-7hx7w" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.616908 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-console-serving-cert\") pod \"console-f9d7485db-7hx7w\" (UID: \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\") " pod="openshift-console/console-f9d7485db-7hx7w" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.617085 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e584f76f-d222-42b7-bfad-8190793ade5c-serving-cert\") pod \"authentication-operator-69f744f599-jqkcz\" (UID: \"e584f76f-d222-42b7-bfad-8190793ade5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jqkcz" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.617175 4563 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9350d335-c1d0-4315-9da0-75a6bd635efc-serving-cert\") pod \"openshift-config-operator-7777fb866f-jmv44\" (UID: \"9350d335-c1d0-4315-9da0-75a6bd635efc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmv44" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.618553 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/78e9b53f-4e52-4b94-8685-d5e84fb0b9cd-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-92skl\" (UID: \"78e9b53f-4e52-4b94-8685-d5e84fb0b9cd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-92skl" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.623283 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3f63d03a-f0c4-486c-9e6e-eb29c84d228b-etcd-service-ca\") pod \"etcd-operator-b45778765-nsrtq\" (UID: \"3f63d03a-f0c4-486c-9e6e-eb29c84d228b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nsrtq" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.634167 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.653305 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.663933 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f63d03a-f0c4-486c-9e6e-eb29c84d228b-config\") pod \"etcd-operator-b45778765-nsrtq\" (UID: \"3f63d03a-f0c4-486c-9e6e-eb29c84d228b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nsrtq" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 
09:06:05.672326 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.693603 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.705918 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3f63d03a-f0c4-486c-9e6e-eb29c84d228b-etcd-client\") pod \"etcd-operator-b45778765-nsrtq\" (UID: \"3f63d03a-f0c4-486c-9e6e-eb29c84d228b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nsrtq" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.713140 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.732856 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.752947 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.756657 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d9a97b3-2f1d-4159-b128-a8e34f58f55b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6l6vs\" (UID: \"3d9a97b3-2f1d-4159-b128-a8e34f58f55b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6l6vs" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.772591 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" 
Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.782847 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d9a97b3-2f1d-4159-b128-a8e34f58f55b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6l6vs\" (UID: \"3d9a97b3-2f1d-4159-b128-a8e34f58f55b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6l6vs" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.793088 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.812977 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.832599 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.845054 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/378c7d30-dd7c-4aa5-83cf-7caca587f283-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-z4lb4\" (UID: \"378c7d30-dd7c-4aa5-83cf-7caca587f283\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-z4lb4" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.853816 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.873233 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.883344 4563 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b028b276-09cb-4d47-af70-1790128259df-config\") pod \"service-ca-operator-777779d784-9b6t7\" (UID: \"b028b276-09cb-4d47-af70-1790128259df\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9b6t7" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.892953 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.905349 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b028b276-09cb-4d47-af70-1790128259df-serving-cert\") pod \"service-ca-operator-777779d784-9b6t7\" (UID: \"b028b276-09cb-4d47-af70-1790128259df\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9b6t7" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.912827 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.933949 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.953307 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.973989 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 24 09:06:05 crc kubenswrapper[4563]: I1124 09:06:05.993427 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.005607 4563 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/6ca055d5-b576-4aa7-bcb2-138156414ff0-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xjp4v\" (UID: \"6ca055d5-b576-4aa7-bcb2-138156414ff0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xjp4v" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.013071 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.033261 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.053247 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.072386 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.092866 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.098559 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2eb44334-ab46-4eca-a39f-7b289792b178-apiservice-cert\") pod \"packageserver-d55dfcdfc-jv526\" (UID: \"2eb44334-ab46-4eca-a39f-7b289792b178\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jv526" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.099187 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/2eb44334-ab46-4eca-a39f-7b289792b178-webhook-cert\") pod \"packageserver-d55dfcdfc-jv526\" (UID: \"2eb44334-ab46-4eca-a39f-7b289792b178\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jv526" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.113306 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.133729 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.153514 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.164674 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07ac5545-11d9-470e-a4e0-1073333eebdc-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wwmlv\" (UID: \"07ac5545-11d9-470e-a4e0-1073333eebdc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wwmlv" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.173132 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.182106 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07ac5545-11d9-470e-a4e0-1073333eebdc-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wwmlv\" (UID: \"07ac5545-11d9-470e-a4e0-1073333eebdc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wwmlv" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.192690 
4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.212398 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.232973 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.252469 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.273295 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.293124 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.312801 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.316676 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f15c383-9eb5-4942-a63d-48e54beea23d-secret-volume\") pod \"collect-profiles-29399580-sccd5\" (UID: \"6f15c383-9eb5-4942-a63d-48e54beea23d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399580-sccd5" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.316834 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8d287fb4-5d89-41ef-953d-92afcb5f33d3-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5mx8n\" (UID: \"8d287fb4-5d89-41ef-953d-92afcb5f33d3\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5mx8n" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.324923 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/28e525e8-966c-4b8a-b9bb-064bfb18b592-profile-collector-cert\") pod \"catalog-operator-68c6474976-jbgzc\" (UID: \"28e525e8-966c-4b8a-b9bb-064bfb18b592\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jbgzc" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.333101 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.351496 4563 request.go:700] Waited for 1.004374358s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-operator/secrets?fieldSelector=metadata.name%3Dingress-operator-dockercfg-7lnqk&limit=500&resourceVersion=0 Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.352543 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.380400 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.392573 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.412962 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.425924 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/28e525e8-966c-4b8a-b9bb-064bfb18b592-srv-cert\") pod \"catalog-operator-68c6474976-jbgzc\" (UID: \"28e525e8-966c-4b8a-b9bb-064bfb18b592\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jbgzc" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.433237 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.452844 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.461859 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f15c383-9eb5-4942-a63d-48e54beea23d-config-volume\") pod \"collect-profiles-29399580-sccd5\" (UID: \"6f15c383-9eb5-4942-a63d-48e54beea23d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399580-sccd5" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.472990 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.487254 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fed196e5-1e64-4d16-b63f-297eac90a06d-default-certificate\") pod \"router-default-5444994796-r5h7f\" (UID: \"fed196e5-1e64-4d16-b63f-297eac90a06d\") " pod="openshift-ingress/router-default-5444994796-r5h7f" Nov 24 09:06:06 crc kubenswrapper[4563]: E1124 09:06:06.489818 4563 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition Nov 24 09:06:06 crc kubenswrapper[4563]: E1124 09:06:06.489910 4563 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/4e13a5b1-f9f7-4045-952a-a44cfd536a99-marketplace-trusted-ca podName:4e13a5b1-f9f7-4045-952a-a44cfd536a99 nodeName:}" failed. No retries permitted until 2025-11-24 09:06:06.98988885 +0000 UTC m=+144.248866297 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/4e13a5b1-f9f7-4045-952a-a44cfd536a99-marketplace-trusted-ca") pod "marketplace-operator-79b997595-zmp5c" (UID: "4e13a5b1-f9f7-4045-952a-a44cfd536a99") : failed to sync configmap cache: timed out waiting for the condition Nov 24 09:06:06 crc kubenswrapper[4563]: E1124 09:06:06.490008 4563 configmap.go:193] Couldn't get configMap openshift-kube-apiserver-operator/kube-apiserver-operator-config: failed to sync configmap cache: timed out waiting for the condition Nov 24 09:06:06 crc kubenswrapper[4563]: E1124 09:06:06.490145 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/48fdef5e-c65c-4898-af52-6ea141ab67b7-config podName:48fdef5e-c65c-4898-af52-6ea141ab67b7 nodeName:}" failed. No retries permitted until 2025-11-24 09:06:06.990121949 +0000 UTC m=+144.249099396 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/48fdef5e-c65c-4898-af52-6ea141ab67b7-config") pod "kube-apiserver-operator-766d6c64bb-ppl5f" (UID: "48fdef5e-c65c-4898-af52-6ea141ab67b7") : failed to sync configmap cache: timed out waiting for the condition Nov 24 09:06:06 crc kubenswrapper[4563]: E1124 09:06:06.490934 4563 secret.go:188] Couldn't get secret openshift-kube-controller-manager-operator/kube-controller-manager-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 24 09:06:06 crc kubenswrapper[4563]: E1124 09:06:06.490991 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/898668ee-3043-4bdc-8e77-82c108bcc65d-serving-cert podName:898668ee-3043-4bdc-8e77-82c108bcc65d nodeName:}" failed. No retries permitted until 2025-11-24 09:06:06.990977111 +0000 UTC m=+144.249954559 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/898668ee-3043-4bdc-8e77-82c108bcc65d-serving-cert") pod "kube-controller-manager-operator-78b949d7b-zgnq6" (UID: "898668ee-3043-4bdc-8e77-82c108bcc65d") : failed to sync secret cache: timed out waiting for the condition Nov 24 09:06:06 crc kubenswrapper[4563]: E1124 09:06:06.492071 4563 configmap.go:193] Couldn't get configMap openshift-kube-controller-manager-operator/kube-controller-manager-operator-config: failed to sync configmap cache: timed out waiting for the condition Nov 24 09:06:06 crc kubenswrapper[4563]: E1124 09:06:06.492143 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/898668ee-3043-4bdc-8e77-82c108bcc65d-config podName:898668ee-3043-4bdc-8e77-82c108bcc65d nodeName:}" failed. No retries permitted until 2025-11-24 09:06:06.992128613 +0000 UTC m=+144.251106061 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/898668ee-3043-4bdc-8e77-82c108bcc65d-config") pod "kube-controller-manager-operator-78b949d7b-zgnq6" (UID: "898668ee-3043-4bdc-8e77-82c108bcc65d") : failed to sync configmap cache: timed out waiting for the condition Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.493247 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 24 09:06:06 crc kubenswrapper[4563]: E1124 09:06:06.493379 4563 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/machine-config-operator-images: failed to sync configmap cache: timed out waiting for the condition Nov 24 09:06:06 crc kubenswrapper[4563]: E1124 09:06:06.493418 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7aeea6e8-f475-47dc-8b80-fade6640c678-images podName:7aeea6e8-f475-47dc-8b80-fade6640c678 nodeName:}" failed. No retries permitted until 2025-11-24 09:06:06.99340973 +0000 UTC m=+144.252387177 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/7aeea6e8-f475-47dc-8b80-fade6640c678-images") pod "machine-config-operator-74547568cd-kj6k4" (UID: "7aeea6e8-f475-47dc-8b80-fade6640c678") : failed to sync configmap cache: timed out waiting for the condition Nov 24 09:06:06 crc kubenswrapper[4563]: E1124 09:06:06.496007 4563 secret.go:188] Couldn't get secret openshift-kube-apiserver-operator/kube-apiserver-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 24 09:06:06 crc kubenswrapper[4563]: E1124 09:06:06.496035 4563 secret.go:188] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition Nov 24 09:06:06 crc kubenswrapper[4563]: E1124 09:06:06.496066 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48fdef5e-c65c-4898-af52-6ea141ab67b7-serving-cert podName:48fdef5e-c65c-4898-af52-6ea141ab67b7 nodeName:}" failed. No retries permitted until 2025-11-24 09:06:06.996054057 +0000 UTC m=+144.255031504 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/48fdef5e-c65c-4898-af52-6ea141ab67b7-serving-cert") pod "kube-apiserver-operator-766d6c64bb-ppl5f" (UID: "48fdef5e-c65c-4898-af52-6ea141ab67b7") : failed to sync secret cache: timed out waiting for the condition Nov 24 09:06:06 crc kubenswrapper[4563]: E1124 09:06:06.496085 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7aeea6e8-f475-47dc-8b80-fade6640c678-proxy-tls podName:7aeea6e8-f475-47dc-8b80-fade6640c678 nodeName:}" failed. No retries permitted until 2025-11-24 09:06:06.996077852 +0000 UTC m=+144.255055299 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/7aeea6e8-f475-47dc-8b80-fade6640c678-proxy-tls") pod "machine-config-operator-74547568cd-kj6k4" (UID: "7aeea6e8-f475-47dc-8b80-fade6640c678") : failed to sync secret cache: timed out waiting for the condition Nov 24 09:06:06 crc kubenswrapper[4563]: E1124 09:06:06.497130 4563 secret.go:188] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: failed to sync secret cache: timed out waiting for the condition Nov 24 09:06:06 crc kubenswrapper[4563]: E1124 09:06:06.497191 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e13a5b1-f9f7-4045-952a-a44cfd536a99-marketplace-operator-metrics podName:4e13a5b1-f9f7-4045-952a-a44cfd536a99 nodeName:}" failed. No retries permitted until 2025-11-24 09:06:06.997176274 +0000 UTC m=+144.256153721 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/4e13a5b1-f9f7-4045-952a-a44cfd536a99-marketplace-operator-metrics") pod "marketplace-operator-79b997595-zmp5c" (UID: "4e13a5b1-f9f7-4045-952a-a44cfd536a99") : failed to sync secret cache: timed out waiting for the condition Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.505485 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fed196e5-1e64-4d16-b63f-297eac90a06d-stats-auth\") pod \"router-default-5444994796-r5h7f\" (UID: \"fed196e5-1e64-4d16-b63f-297eac90a06d\") " pod="openshift-ingress/router-default-5444994796-r5h7f" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.513364 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.524830 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/fed196e5-1e64-4d16-b63f-297eac90a06d-metrics-certs\") pod \"router-default-5444994796-r5h7f\" (UID: \"fed196e5-1e64-4d16-b63f-297eac90a06d\") " pod="openshift-ingress/router-default-5444994796-r5h7f" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.533123 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.552472 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.563215 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fed196e5-1e64-4d16-b63f-297eac90a06d-service-ca-bundle\") pod \"router-default-5444994796-r5h7f\" (UID: \"fed196e5-1e64-4d16-b63f-297eac90a06d\") " pod="openshift-ingress/router-default-5444994796-r5h7f" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.572821 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.593142 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 24 09:06:06 crc kubenswrapper[4563]: E1124 09:06:06.612408 4563 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 24 09:06:06 crc kubenswrapper[4563]: E1124 09:06:06.612473 4563 secret.go:188] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition Nov 24 09:06:06 crc kubenswrapper[4563]: E1124 09:06:06.612501 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d287fb4-5d89-41ef-953d-92afcb5f33d3-srv-cert 
podName:8d287fb4-5d89-41ef-953d-92afcb5f33d3 nodeName:}" failed. No retries permitted until 2025-11-24 09:06:07.112473741 +0000 UTC m=+144.371451188 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/8d287fb4-5d89-41ef-953d-92afcb5f33d3-srv-cert") pod "olm-operator-6b444d44fb-5mx8n" (UID: "8d287fb4-5d89-41ef-953d-92afcb5f33d3") : failed to sync secret cache: timed out waiting for the condition Nov 24 09:06:06 crc kubenswrapper[4563]: E1124 09:06:06.612525 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e11ba0a3-483c-4306-a89f-61f79a52b10d-webhook-certs podName:e11ba0a3-483c-4306-a89f-61f79a52b10d nodeName:}" failed. No retries permitted until 2025-11-24 09:06:07.112511874 +0000 UTC m=+144.371489331 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e11ba0a3-483c-4306-a89f-61f79a52b10d-webhook-certs") pod "multus-admission-controller-857f4d67dd-jj9mw" (UID: "e11ba0a3-483c-4306-a89f-61f79a52b10d") : failed to sync secret cache: timed out waiting for the condition Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.612661 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.633388 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.652900 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.677515 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.693566 4563 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.712599 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.732500 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.752854 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.773181 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.792614 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.813198 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.832481 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.852928 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.873066 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.892853 4563 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.913062 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.932717 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.952886 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.972674 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 24 09:06:06 crc kubenswrapper[4563]: I1124 09:06:06.993490 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.013044 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.030667 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7aeea6e8-f475-47dc-8b80-fade6640c678-proxy-tls\") pod \"machine-config-operator-74547568cd-kj6k4\" (UID: \"7aeea6e8-f475-47dc-8b80-fade6640c678\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kj6k4" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.030738 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48fdef5e-c65c-4898-af52-6ea141ab67b7-serving-cert\") pod 
\"kube-apiserver-operator-766d6c64bb-ppl5f\" (UID: \"48fdef5e-c65c-4898-af52-6ea141ab67b7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ppl5f" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.030781 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4e13a5b1-f9f7-4045-952a-a44cfd536a99-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zmp5c\" (UID: \"4e13a5b1-f9f7-4045-952a-a44cfd536a99\") " pod="openshift-marketplace/marketplace-operator-79b997595-zmp5c" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.030854 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48fdef5e-c65c-4898-af52-6ea141ab67b7-config\") pod \"kube-apiserver-operator-766d6c64bb-ppl5f\" (UID: \"48fdef5e-c65c-4898-af52-6ea141ab67b7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ppl5f" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.030870 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e13a5b1-f9f7-4045-952a-a44cfd536a99-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zmp5c\" (UID: \"4e13a5b1-f9f7-4045-952a-a44cfd536a99\") " pod="openshift-marketplace/marketplace-operator-79b997595-zmp5c" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.030912 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/898668ee-3043-4bdc-8e77-82c108bcc65d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zgnq6\" (UID: \"898668ee-3043-4bdc-8e77-82c108bcc65d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zgnq6" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.030946 4563 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/898668ee-3043-4bdc-8e77-82c108bcc65d-config\") pod \"kube-controller-manager-operator-78b949d7b-zgnq6\" (UID: \"898668ee-3043-4bdc-8e77-82c108bcc65d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zgnq6" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.030973 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7aeea6e8-f475-47dc-8b80-fade6640c678-images\") pod \"machine-config-operator-74547568cd-kj6k4\" (UID: \"7aeea6e8-f475-47dc-8b80-fade6640c678\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kj6k4" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.031530 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48fdef5e-c65c-4898-af52-6ea141ab67b7-config\") pod \"kube-apiserver-operator-766d6c64bb-ppl5f\" (UID: \"48fdef5e-c65c-4898-af52-6ea141ab67b7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ppl5f" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.031682 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7aeea6e8-f475-47dc-8b80-fade6640c678-images\") pod \"machine-config-operator-74547568cd-kj6k4\" (UID: \"7aeea6e8-f475-47dc-8b80-fade6640c678\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kj6k4" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.031745 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/898668ee-3043-4bdc-8e77-82c108bcc65d-config\") pod \"kube-controller-manager-operator-78b949d7b-zgnq6\" (UID: \"898668ee-3043-4bdc-8e77-82c108bcc65d\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zgnq6" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.032034 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e13a5b1-f9f7-4045-952a-a44cfd536a99-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zmp5c\" (UID: \"4e13a5b1-f9f7-4045-952a-a44cfd536a99\") " pod="openshift-marketplace/marketplace-operator-79b997595-zmp5c" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.033888 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4e13a5b1-f9f7-4045-952a-a44cfd536a99-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zmp5c\" (UID: \"4e13a5b1-f9f7-4045-952a-a44cfd536a99\") " pod="openshift-marketplace/marketplace-operator-79b997595-zmp5c" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.034096 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7aeea6e8-f475-47dc-8b80-fade6640c678-proxy-tls\") pod \"machine-config-operator-74547568cd-kj6k4\" (UID: \"7aeea6e8-f475-47dc-8b80-fade6640c678\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kj6k4" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.034273 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48fdef5e-c65c-4898-af52-6ea141ab67b7-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-ppl5f\" (UID: \"48fdef5e-c65c-4898-af52-6ea141ab67b7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ppl5f" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.034703 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/898668ee-3043-4bdc-8e77-82c108bcc65d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zgnq6\" (UID: \"898668ee-3043-4bdc-8e77-82c108bcc65d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zgnq6" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.043745 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw2j4\" (UniqueName: \"kubernetes.io/projected/e7c61447-7ef6-408e-8034-46506b36b5d2-kube-api-access-tw2j4\") pod \"route-controller-manager-6576b87f9c-rtrwd\" (UID: \"e7c61447-7ef6-408e-8034-46506b36b5d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.064921 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xksh4\" (UniqueName: \"kubernetes.io/projected/27cfa44e-65c6-48d1-bafb-ce4806fc043b-kube-api-access-xksh4\") pod \"console-operator-58897d9998-9bp54\" (UID: \"27cfa44e-65c6-48d1-bafb-ce4806fc043b\") " pod="openshift-console-operator/console-operator-58897d9998-9bp54" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.080170 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.084228 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jk4l\" (UniqueName: \"kubernetes.io/projected/245aea2a-4167-418a-910c-91bf4836c8dc-kube-api-access-5jk4l\") pod \"oauth-openshift-558db77b4-nnx6h\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.105163 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4fx8\" (UniqueName: \"kubernetes.io/projected/728c9cea-9302-4856-95cc-2ea71352ec94-kube-api-access-k4fx8\") pod \"cluster-samples-operator-665b6dd947-fdqpk\" (UID: \"728c9cea-9302-4856-95cc-2ea71352ec94\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fdqpk" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.117076 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.126294 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-9bp54" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.132626 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8d287fb4-5d89-41ef-953d-92afcb5f33d3-srv-cert\") pod \"olm-operator-6b444d44fb-5mx8n\" (UID: \"8d287fb4-5d89-41ef-953d-92afcb5f33d3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5mx8n" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.132767 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e11ba0a3-483c-4306-a89f-61f79a52b10d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-jj9mw\" (UID: \"e11ba0a3-483c-4306-a89f-61f79a52b10d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jj9mw" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.136248 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e11ba0a3-483c-4306-a89f-61f79a52b10d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-jj9mw\" (UID: \"e11ba0a3-483c-4306-a89f-61f79a52b10d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jj9mw" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.136286 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8d287fb4-5d89-41ef-953d-92afcb5f33d3-srv-cert\") pod \"olm-operator-6b444d44fb-5mx8n\" (UID: \"8d287fb4-5d89-41ef-953d-92afcb5f33d3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5mx8n" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.147000 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np4h5\" (UniqueName: \"kubernetes.io/projected/8c8ee9e7-2ff1-4be6-bd44-60193b7aed66-kube-api-access-np4h5\") pod 
\"openshift-apiserver-operator-796bbdcf4f-lpv9k\" (UID: \"8c8ee9e7-2ff1-4be6-bd44-60193b7aed66\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lpv9k" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.165291 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tkg5\" (UniqueName: \"kubernetes.io/projected/99e63a17-8605-4830-96b4-dd619cf76549-kube-api-access-4tkg5\") pod \"controller-manager-879f6c89f-g9q5v\" (UID: \"99e63a17-8605-4830-96b4-dd619cf76549\") " pod="openshift-controller-manager/controller-manager-879f6c89f-g9q5v" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.174753 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.194503 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.214716 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.227814 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd"] Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.233048 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 24 09:06:07 crc kubenswrapper[4563]: W1124 09:06:07.235405 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7c61447_7ef6_408e_8034_46506b36b5d2.slice/crio-1cb9ea1cb353326b34a8c8b43a54788bc63dfeba339982fc64587a588d8fa8d0 WatchSource:0}: Error finding container 1cb9ea1cb353326b34a8c8b43a54788bc63dfeba339982fc64587a588d8fa8d0: Status 404 returned error can't find the 
container with id 1cb9ea1cb353326b34a8c8b43a54788bc63dfeba339982fc64587a588d8fa8d0 Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.252820 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.253258 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nnx6h"] Nov 24 09:06:07 crc kubenswrapper[4563]: W1124 09:06:07.259671 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod245aea2a_4167_418a_910c_91bf4836c8dc.slice/crio-ff96cd001028e801d0699fc728f28fe7b7bba50740c32f1bacfc3dc72547a875 WatchSource:0}: Error finding container ff96cd001028e801d0699fc728f28fe7b7bba50740c32f1bacfc3dc72547a875: Status 404 returned error can't find the container with id ff96cd001028e801d0699fc728f28fe7b7bba50740c32f1bacfc3dc72547a875 Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.272335 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.292847 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.302094 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9bp54"] Nov 24 09:06:07 crc kubenswrapper[4563]: W1124 09:06:07.311285 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27cfa44e_65c6_48d1_bafb_ce4806fc043b.slice/crio-c808652839c507df5337e1754327b42f6b875421c246af46ab0d36fd061d39b6 WatchSource:0}: Error finding container c808652839c507df5337e1754327b42f6b875421c246af46ab0d36fd061d39b6: 
Status 404 returned error can't find the container with id c808652839c507df5337e1754327b42f6b875421c246af46ab0d36fd061d39b6 Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.312727 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.333364 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.351524 4563 request.go:700] Waited for 1.933030633s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/configmaps?fieldSelector=metadata.name%3Ddns-default&limit=500&resourceVersion=0 Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.352937 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.364039 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lpv9k" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.373454 4563 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.377546 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-g9q5v" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.392897 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.405687 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fdqpk" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.413462 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.447055 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twccj\" (UniqueName: \"kubernetes.io/projected/7aeea6e8-f475-47dc-8b80-fade6640c678-kube-api-access-twccj\") pod \"machine-config-operator-74547568cd-kj6k4\" (UID: \"7aeea6e8-f475-47dc-8b80-fade6640c678\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kj6k4" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.468386 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48fdef5e-c65c-4898-af52-6ea141ab67b7-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-ppl5f\" (UID: \"48fdef5e-c65c-4898-af52-6ea141ab67b7\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ppl5f" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.487111 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzr8v\" (UniqueName: \"kubernetes.io/projected/a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51-kube-api-access-fzr8v\") pod \"apiserver-76f77b778f-l6b8r\" (UID: \"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51\") " pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.508906 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs54w\" (UniqueName: \"kubernetes.io/projected/48157749-8872-4c5b-b119-efe27cfd887e-kube-api-access-xs54w\") pod \"machine-api-operator-5694c8668f-bxdlb\" (UID: \"48157749-8872-4c5b-b119-efe27cfd887e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bxdlb" Nov 24 09:06:07 
crc kubenswrapper[4563]: I1124 09:06:07.525094 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xnff\" (UniqueName: \"kubernetes.io/projected/b5a091e0-6549-4f21-a0dc-5f7452dc9c0f-kube-api-access-4xnff\") pod \"downloads-7954f5f757-tzmwx\" (UID: \"b5a091e0-6549-4f21-a0dc-5f7452dc9c0f\") " pod="openshift-console/downloads-7954f5f757-tzmwx" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.534233 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lpv9k"] Nov 24 09:06:07 crc kubenswrapper[4563]: W1124 09:06:07.540814 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c8ee9e7_2ff1_4be6_bd44_60193b7aed66.slice/crio-80cbd9b36f64957cca0651d26b81a3fcfd4758d8537b7f74cbd710b8c7f83073 WatchSource:0}: Error finding container 80cbd9b36f64957cca0651d26b81a3fcfd4758d8537b7f74cbd710b8c7f83073: Status 404 returned error can't find the container with id 80cbd9b36f64957cca0651d26b81a3fcfd4758d8537b7f74cbd710b8c7f83073 Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.545242 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-g9q5v"] Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.545655 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxrqz\" (UniqueName: \"kubernetes.io/projected/2eb44334-ab46-4eca-a39f-7b289792b178-kube-api-access-pxrqz\") pod \"packageserver-d55dfcdfc-jv526\" (UID: \"2eb44334-ab46-4eca-a39f-7b289792b178\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jv526" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.547083 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-bxdlb" Nov 24 09:06:07 crc kubenswrapper[4563]: W1124 09:06:07.552302 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99e63a17_8605_4830_96b4_dd619cf76549.slice/crio-642ca4d153567074abf47780a4f5a2029c2adaabb3592b271b1b3b0c3e9e6225 WatchSource:0}: Error finding container 642ca4d153567074abf47780a4f5a2029c2adaabb3592b271b1b3b0c3e9e6225: Status 404 returned error can't find the container with id 642ca4d153567074abf47780a4f5a2029c2adaabb3592b271b1b3b0c3e9e6225 Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.556742 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-9bp54" event={"ID":"27cfa44e-65c6-48d1-bafb-ce4806fc043b","Type":"ContainerStarted","Data":"8dcd935f4a7a81239b41f9ab93f83ff418f64cff5a62759ef61c37b714bee7e9"} Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.556779 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-9bp54" event={"ID":"27cfa44e-65c6-48d1-bafb-ce4806fc043b","Type":"ContainerStarted","Data":"c808652839c507df5337e1754327b42f6b875421c246af46ab0d36fd061d39b6"} Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.556983 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-9bp54" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.558873 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd" event={"ID":"e7c61447-7ef6-408e-8034-46506b36b5d2","Type":"ContainerStarted","Data":"29da0e72aa1a43c393446d564de1d3d45c9f96b46951dd175c5753cf31027e67"} Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.558906 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd" event={"ID":"e7c61447-7ef6-408e-8034-46506b36b5d2","Type":"ContainerStarted","Data":"1cb9ea1cb353326b34a8c8b43a54788bc63dfeba339982fc64587a588d8fa8d0"} Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.559074 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.560173 4563 patch_prober.go:28] interesting pod/console-operator-58897d9998-9bp54 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.560200 4563 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-9bp54" podUID="27cfa44e-65c6-48d1-bafb-ce4806fc043b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.561262 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lpv9k" event={"ID":"8c8ee9e7-2ff1-4be6-bd44-60193b7aed66","Type":"ContainerStarted","Data":"80cbd9b36f64957cca0651d26b81a3fcfd4758d8537b7f74cbd710b8c7f83073"} Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.562026 4563 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-rtrwd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.562048 4563 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd" podUID="e7c61447-7ef6-408e-8034-46506b36b5d2" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.564729 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" event={"ID":"245aea2a-4167-418a-910c-91bf4836c8dc","Type":"ContainerStarted","Data":"00b57996bc38954c56b1780dec352c78b2db45b2ba61bcb2046a509c497d4c53"} Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.564758 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" event={"ID":"245aea2a-4167-418a-910c-91bf4836c8dc","Type":"ContainerStarted","Data":"ff96cd001028e801d0699fc728f28fe7b7bba50740c32f1bacfc3dc72547a875"} Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.564927 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.565547 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/898668ee-3043-4bdc-8e77-82c108bcc65d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zgnq6\" (UID: \"898668ee-3043-4bdc-8e77-82c108bcc65d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zgnq6" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.566765 4563 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-nnx6h container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.18:6443/healthz\": dial tcp 10.217.0.18:6443: connect: connection 
refused" start-of-body= Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.566808 4563 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" podUID="245aea2a-4167-418a-910c-91bf4836c8dc" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.18:6443/healthz\": dial tcp 10.217.0.18:6443: connect: connection refused" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.585109 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffvcw\" (UniqueName: \"kubernetes.io/projected/4e13a5b1-f9f7-4045-952a-a44cfd536a99-kube-api-access-ffvcw\") pod \"marketplace-operator-79b997595-zmp5c\" (UID: \"4e13a5b1-f9f7-4045-952a-a44cfd536a99\") " pod="openshift-marketplace/marketplace-operator-79b997595-zmp5c" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.590587 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jv526" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.606854 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h967v\" (UniqueName: \"kubernetes.io/projected/378c7d30-dd7c-4aa5-83cf-7caca587f283-kube-api-access-h967v\") pod \"control-plane-machine-set-operator-78cbb6b69f-z4lb4\" (UID: \"378c7d30-dd7c-4aa5-83cf-7caca587f283\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-z4lb4" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.626416 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh8l6\" (UniqueName: \"kubernetes.io/projected/6f15c383-9eb5-4942-a63d-48e54beea23d-kube-api-access-rh8l6\") pod \"collect-profiles-29399580-sccd5\" (UID: \"6f15c383-9eb5-4942-a63d-48e54beea23d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399580-sccd5" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.642493 4563 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399580-sccd5" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.643315 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdrch\" (UniqueName: \"kubernetes.io/projected/dcb29296-6b35-478f-9712-ef96d33867c2-kube-api-access-bdrch\") pod \"migrator-59844c95c7-vsmrn\" (UID: \"dcb29296-6b35-478f-9712-ef96d33867c2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vsmrn" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.647105 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zmp5c" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.665040 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqzqs\" (UniqueName: \"kubernetes.io/projected/20670dac-a915-49d2-8953-cb842980ca87-kube-api-access-fqzqs\") pod \"apiserver-7bbb656c7d-4zwrj\" (UID: \"20670dac-a915-49d2-8953-cb842980ca87\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.666684 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kj6k4" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.671720 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zgnq6" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.678041 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ppl5f" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.685409 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07ac5545-11d9-470e-a4e0-1073333eebdc-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wwmlv\" (UID: \"07ac5545-11d9-470e-a4e0-1073333eebdc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wwmlv" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.708733 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzzfd\" (UniqueName: \"kubernetes.io/projected/6ca055d5-b576-4aa7-bcb2-138156414ff0-kube-api-access-gzzfd\") pod \"package-server-manager-789f6589d5-xjp4v\" (UID: \"6ca055d5-b576-4aa7-bcb2-138156414ff0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xjp4v" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.715164 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-bxdlb"] Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.724606 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgc69\" (UniqueName: \"kubernetes.io/projected/a1768f84-8c65-48c1-beb8-1309d1d2e823-kube-api-access-lgc69\") pod \"machine-approver-56656f9798-q4vst\" (UID: \"a1768f84-8c65-48c1-beb8-1309d1d2e823\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q4vst" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.747294 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jv526"] Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.753917 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.760890 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xjp4v" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.770890 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xmpd\" (UniqueName: \"kubernetes.io/projected/b028b276-09cb-4d47-af70-1790128259df-kube-api-access-6xmpd\") pod \"service-ca-operator-777779d784-9b6t7\" (UID: \"b028b276-09cb-4d47-af70-1790128259df\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9b6t7" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.779960 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.795828 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4dpq\" (UniqueName: \"kubernetes.io/projected/74b64da0-d6d7-44d6-9be9-d1120a019e02-kube-api-access-q4dpq\") pod \"openshift-controller-manager-operator-756b6f6bc6-54kkk\" (UID: \"74b64da0-d6d7-44d6-9be9-d1120a019e02\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54kkk" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.801989 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fdqpk"] Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.809569 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hrrj\" (UniqueName: \"kubernetes.io/projected/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-kube-api-access-5hrrj\") pod \"console-f9d7485db-7hx7w\" (UID: \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\") " pod="openshift-console/console-f9d7485db-7hx7w" 
Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.814491 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-tzmwx" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.820254 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q4vst" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.824942 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-7hx7w" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.829107 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399580-sccd5"] Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.830918 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhslv\" (UniqueName: \"kubernetes.io/projected/e584f76f-d222-42b7-bfad-8190793ade5c-kube-api-access-jhslv\") pod \"authentication-operator-69f744f599-jqkcz\" (UID: \"e584f76f-d222-42b7-bfad-8190793ade5c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jqkcz" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.842748 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:06:07 crc kubenswrapper[4563]: E1124 09:06:07.843134 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-24 09:08:09.843119072 +0000 UTC m=+267.102096520 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.843803 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54kkk" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.848222 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwp4h\" (UniqueName: \"kubernetes.io/projected/78e9b53f-4e52-4b94-8685-d5e84fb0b9cd-kube-api-access-lwp4h\") pod \"cluster-image-registry-operator-dc59b4c8b-92skl\" (UID: \"78e9b53f-4e52-4b94-8685-d5e84fb0b9cd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-92skl" Nov 24 09:06:07 crc kubenswrapper[4563]: W1124 09:06:07.858053 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f15c383_9eb5_4942_a63d_48e54beea23d.slice/crio-6773ada90c039f9e8576a8f046592ec013cec3d230d5a91b2f21e8d5b7e60f79 WatchSource:0}: Error finding container 6773ada90c039f9e8576a8f046592ec013cec3d230d5a91b2f21e8d5b7e60f79: Status 404 returned error can't find the container with id 6773ada90c039f9e8576a8f046592ec013cec3d230d5a91b2f21e8d5b7e60f79 Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.868778 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-z4lb4" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.868924 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb7bc\" (UniqueName: \"kubernetes.io/projected/3d9a97b3-2f1d-4159-b128-a8e34f58f55b-kube-api-access-zb7bc\") pod \"kube-storage-version-migrator-operator-b67b599dd-6l6vs\" (UID: \"3d9a97b3-2f1d-4159-b128-a8e34f58f55b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6l6vs" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.876458 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9b6t7" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.882864 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vsmrn" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.893082 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56kvf\" (UniqueName: \"kubernetes.io/projected/25f557a3-e5cd-4355-9e52-b7542c0103d2-kube-api-access-56kvf\") pod \"dns-operator-744455d44c-rshgn\" (UID: \"25f557a3-e5cd-4355-9e52-b7542c0103d2\") " pod="openshift-dns-operator/dns-operator-744455d44c-rshgn" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.894477 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wwmlv" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.913941 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppqhd\" (UniqueName: \"kubernetes.io/projected/9350d335-c1d0-4315-9da0-75a6bd635efc-kube-api-access-ppqhd\") pod \"openshift-config-operator-7777fb866f-jmv44\" (UID: \"9350d335-c1d0-4315-9da0-75a6bd635efc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmv44" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.917430 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zmp5c"] Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.943730 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.943995 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.944027 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.944106 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.945277 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.947718 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbjxc\" (UniqueName: \"kubernetes.io/projected/28e525e8-966c-4b8a-b9bb-064bfb18b592-kube-api-access-zbjxc\") pod \"catalog-operator-68c6474976-jbgzc\" (UID: \"28e525e8-966c-4b8a-b9bb-064bfb18b592\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jbgzc" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.948613 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfzs9\" (UniqueName: \"kubernetes.io/projected/fed196e5-1e64-4d16-b63f-297eac90a06d-kube-api-access-kfzs9\") pod \"router-default-5444994796-r5h7f\" (UID: \"fed196e5-1e64-4d16-b63f-297eac90a06d\") " pod="openshift-ingress/router-default-5444994796-r5h7f" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.949425 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.949985 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.951337 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.969509 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zgnq6"] Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.974925 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/78e9b53f-4e52-4b94-8685-d5e84fb0b9cd-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-92skl\" (UID: \"78e9b53f-4e52-4b94-8685-d5e84fb0b9cd\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-92skl" Nov 24 09:06:07 crc kubenswrapper[4563]: I1124 09:06:07.987138 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxlvj\" (UniqueName: 
\"kubernetes.io/projected/8d287fb4-5d89-41ef-953d-92afcb5f33d3-kube-api-access-jxlvj\") pod \"olm-operator-6b444d44fb-5mx8n\" (UID: \"8d287fb4-5d89-41ef-953d-92afcb5f33d3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5mx8n" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.013992 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-275wc\" (UniqueName: \"kubernetes.io/projected/e11ba0a3-483c-4306-a89f-61f79a52b10d-kube-api-access-275wc\") pod \"multus-admission-controller-857f4d67dd-jj9mw\" (UID: \"e11ba0a3-483c-4306-a89f-61f79a52b10d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jj9mw" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.033155 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9ph4\" (UniqueName: \"kubernetes.io/projected/3f63d03a-f0c4-486c-9e6e-eb29c84d228b-kube-api-access-l9ph4\") pod \"etcd-operator-b45778765-nsrtq\" (UID: \"3f63d03a-f0c4-486c-9e6e-eb29c84d228b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nsrtq" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.042132 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xjp4v"] Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.045516 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4952a751-3601-4381-9b92-d5d720b6dca2-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.045561 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/deb0f70e-c451-439a-afb4-c26f327638d9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vpxzg\" (UID: \"deb0f70e-c451-439a-afb4-c26f327638d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vpxzg" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.045592 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c4nk\" (UniqueName: \"kubernetes.io/projected/00de88df-032d-4a2a-aa57-db8080b919bf-kube-api-access-9c4nk\") pod \"machine-config-controller-84d6567774-thzh7\" (UID: \"00de88df-032d-4a2a-aa57-db8080b919bf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-thzh7" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.045612 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/00de88df-032d-4a2a-aa57-db8080b919bf-proxy-tls\") pod \"machine-config-controller-84d6567774-thzh7\" (UID: \"00de88df-032d-4a2a-aa57-db8080b919bf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-thzh7" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.045650 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swqkx\" (UniqueName: \"kubernetes.io/projected/deb0f70e-c451-439a-afb4-c26f327638d9-kube-api-access-swqkx\") pod \"ingress-operator-5b745b69d9-vpxzg\" (UID: \"deb0f70e-c451-439a-afb4-c26f327638d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vpxzg" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.045670 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4952a751-3601-4381-9b92-d5d720b6dca2-trusted-ca\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.045695 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4952a751-3601-4381-9b92-d5d720b6dca2-registry-tls\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.045718 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/deb0f70e-c451-439a-afb4-c26f327638d9-trusted-ca\") pod \"ingress-operator-5b745b69d9-vpxzg\" (UID: \"deb0f70e-c451-439a-afb4-c26f327638d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vpxzg" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.045737 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0eba8762-dd47-4910-bae6-79802ac64ba8-signing-key\") pod \"service-ca-9c57cc56f-fd8lt\" (UID: \"0eba8762-dd47-4910-bae6-79802ac64ba8\") " pod="openshift-service-ca/service-ca-9c57cc56f-fd8lt" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.045755 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmd6r\" (UniqueName: \"kubernetes.io/projected/4952a751-3601-4381-9b92-d5d720b6dca2-kube-api-access-pmd6r\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.045778 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/00de88df-032d-4a2a-aa57-db8080b919bf-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-thzh7\" (UID: \"00de88df-032d-4a2a-aa57-db8080b919bf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-thzh7" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.045800 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0eba8762-dd47-4910-bae6-79802ac64ba8-signing-cabundle\") pod \"service-ca-9c57cc56f-fd8lt\" (UID: \"0eba8762-dd47-4910-bae6-79802ac64ba8\") " pod="openshift-service-ca/service-ca-9c57cc56f-fd8lt" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.045818 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4952a751-3601-4381-9b92-d5d720b6dca2-bound-sa-token\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.045837 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4952a751-3601-4381-9b92-d5d720b6dca2-registry-certificates\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.045857 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4952a751-3601-4381-9b92-d5d720b6dca2-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.045875 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdkc9\" (UniqueName: \"kubernetes.io/projected/0eba8762-dd47-4910-bae6-79802ac64ba8-kube-api-access-kdkc9\") pod \"service-ca-9c57cc56f-fd8lt\" (UID: \"0eba8762-dd47-4910-bae6-79802ac64ba8\") " pod="openshift-service-ca/service-ca-9c57cc56f-fd8lt" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.045901 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.045936 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/deb0f70e-c451-439a-afb4-c26f327638d9-metrics-tls\") pod \"ingress-operator-5b745b69d9-vpxzg\" (UID: \"deb0f70e-c451-439a-afb4-c26f327638d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vpxzg" Nov 24 09:06:08 crc kubenswrapper[4563]: E1124 09:06:08.046327 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:08.546313433 +0000 UTC m=+145.805290880 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.076449 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jqkcz" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.109368 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-92skl" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.129838 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmv44" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.135010 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-rshgn" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.144177 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kj6k4"] Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.145658 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ppl5f"] Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.146787 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:06:08 crc kubenswrapper[4563]: E1124 09:06:08.147019 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:06:08.647003452 +0000 UTC m=+145.905980899 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.147087 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.147134 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zmhm\" (UniqueName: \"kubernetes.io/projected/b951c7e3-cc27-4f6f-8759-485c62c6d131-kube-api-access-6zmhm\") pod \"machine-config-server-59266\" (UID: \"b951c7e3-cc27-4f6f-8759-485c62c6d131\") " pod="openshift-machine-config-operator/machine-config-server-59266" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.147163 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b951c7e3-cc27-4f6f-8759-485c62c6d131-certs\") pod \"machine-config-server-59266\" (UID: \"b951c7e3-cc27-4f6f-8759-485c62c6d131\") " pod="openshift-machine-config-operator/machine-config-server-59266" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.147264 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66-registration-dir\") pod \"csi-hostpathplugin-zq6xk\" (UID: \"e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66\") " pod="hostpath-provisioner/csi-hostpathplugin-zq6xk" Nov 24 09:06:08 crc kubenswrapper[4563]: E1124 09:06:08.147343 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:08.647336641 +0000 UTC m=+145.906314077 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.147369 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/deb0f70e-c451-439a-afb4-c26f327638d9-metrics-tls\") pod \"ingress-operator-5b745b69d9-vpxzg\" (UID: \"deb0f70e-c451-439a-afb4-c26f327638d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vpxzg" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.147491 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b951c7e3-cc27-4f6f-8759-485c62c6d131-node-bootstrap-token\") pod \"machine-config-server-59266\" (UID: \"b951c7e3-cc27-4f6f-8759-485c62c6d131\") " pod="openshift-machine-config-operator/machine-config-server-59266" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.147536 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66-plugins-dir\") pod \"csi-hostpathplugin-zq6xk\" (UID: \"e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66\") " pod="hostpath-provisioner/csi-hostpathplugin-zq6xk" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.147605 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4952a751-3601-4381-9b92-d5d720b6dca2-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.147741 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/deb0f70e-c451-439a-afb4-c26f327638d9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vpxzg\" (UID: \"deb0f70e-c451-439a-afb4-c26f327638d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vpxzg" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.147760 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c4nk\" (UniqueName: \"kubernetes.io/projected/00de88df-032d-4a2a-aa57-db8080b919bf-kube-api-access-9c4nk\") pod \"machine-config-controller-84d6567774-thzh7\" (UID: \"00de88df-032d-4a2a-aa57-db8080b919bf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-thzh7" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.147777 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/00de88df-032d-4a2a-aa57-db8080b919bf-proxy-tls\") pod \"machine-config-controller-84d6567774-thzh7\" (UID: \"00de88df-032d-4a2a-aa57-db8080b919bf\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-thzh7" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.147823 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swqkx\" (UniqueName: \"kubernetes.io/projected/deb0f70e-c451-439a-afb4-c26f327638d9-kube-api-access-swqkx\") pod \"ingress-operator-5b745b69d9-vpxzg\" (UID: \"deb0f70e-c451-439a-afb4-c26f327638d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vpxzg" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.147862 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4952a751-3601-4381-9b92-d5d720b6dca2-trusted-ca\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.147892 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66-socket-dir\") pod \"csi-hostpathplugin-zq6xk\" (UID: \"e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66\") " pod="hostpath-provisioner/csi-hostpathplugin-zq6xk" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.147912 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/daf55391-e211-42e2-9159-29a347fa220e-cert\") pod \"ingress-canary-wsk9h\" (UID: \"daf55391-e211-42e2-9159-29a347fa220e\") " pod="openshift-ingress-canary/ingress-canary-wsk9h" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.147972 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66-mountpoint-dir\") pod 
\"csi-hostpathplugin-zq6xk\" (UID: \"e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66\") " pod="hostpath-provisioner/csi-hostpathplugin-zq6xk" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.148020 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4952a751-3601-4381-9b92-d5d720b6dca2-registry-tls\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.148090 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpzfn\" (UniqueName: \"kubernetes.io/projected/daf55391-e211-42e2-9159-29a347fa220e-kube-api-access-lpzfn\") pod \"ingress-canary-wsk9h\" (UID: \"daf55391-e211-42e2-9159-29a347fa220e\") " pod="openshift-ingress-canary/ingress-canary-wsk9h" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.148146 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vthx6\" (UniqueName: \"kubernetes.io/projected/e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66-kube-api-access-vthx6\") pod \"csi-hostpathplugin-zq6xk\" (UID: \"e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66\") " pod="hostpath-provisioner/csi-hostpathplugin-zq6xk" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.148162 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd9fea55-0ca7-4676-b5b7-53231b66d601-config-volume\") pod \"dns-default-glp9q\" (UID: \"cd9fea55-0ca7-4676-b5b7-53231b66d601\") " pod="openshift-dns/dns-default-glp9q" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.148223 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/deb0f70e-c451-439a-afb4-c26f327638d9-trusted-ca\") pod \"ingress-operator-5b745b69d9-vpxzg\" (UID: \"deb0f70e-c451-439a-afb4-c26f327638d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vpxzg" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.148270 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0eba8762-dd47-4910-bae6-79802ac64ba8-signing-key\") pod \"service-ca-9c57cc56f-fd8lt\" (UID: \"0eba8762-dd47-4910-bae6-79802ac64ba8\") " pod="openshift-service-ca/service-ca-9c57cc56f-fd8lt" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.148369 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmd6r\" (UniqueName: \"kubernetes.io/projected/4952a751-3601-4381-9b92-d5d720b6dca2-kube-api-access-pmd6r\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.148387 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66-csi-data-dir\") pod \"csi-hostpathplugin-zq6xk\" (UID: \"e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66\") " pod="hostpath-provisioner/csi-hostpathplugin-zq6xk" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.148434 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/00de88df-032d-4a2a-aa57-db8080b919bf-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-thzh7\" (UID: \"00de88df-032d-4a2a-aa57-db8080b919bf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-thzh7" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.148477 4563 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cd9fea55-0ca7-4676-b5b7-53231b66d601-metrics-tls\") pod \"dns-default-glp9q\" (UID: \"cd9fea55-0ca7-4676-b5b7-53231b66d601\") " pod="openshift-dns/dns-default-glp9q" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.150770 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0eba8762-dd47-4910-bae6-79802ac64ba8-signing-cabundle\") pod \"service-ca-9c57cc56f-fd8lt\" (UID: \"0eba8762-dd47-4910-bae6-79802ac64ba8\") " pod="openshift-service-ca/service-ca-9c57cc56f-fd8lt" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.150912 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4952a751-3601-4381-9b92-d5d720b6dca2-bound-sa-token\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.151855 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4952a751-3601-4381-9b92-d5d720b6dca2-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.152292 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/00de88df-032d-4a2a-aa57-db8080b919bf-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-thzh7\" (UID: \"00de88df-032d-4a2a-aa57-db8080b919bf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-thzh7" Nov 24 09:06:08 crc 
kubenswrapper[4563]: I1124 09:06:08.159897 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4952a751-3601-4381-9b92-d5d720b6dca2-trusted-ca\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.160729 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0eba8762-dd47-4910-bae6-79802ac64ba8-signing-key\") pod \"service-ca-9c57cc56f-fd8lt\" (UID: \"0eba8762-dd47-4910-bae6-79802ac64ba8\") " pod="openshift-service-ca/service-ca-9c57cc56f-fd8lt" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.170622 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-l6b8r"] Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.171210 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmd6r\" (UniqueName: \"kubernetes.io/projected/4952a751-3601-4381-9b92-d5d720b6dca2-kube-api-access-pmd6r\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.178885 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-nsrtq" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.180269 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/deb0f70e-c451-439a-afb4-c26f327638d9-metrics-tls\") pod \"ingress-operator-5b745b69d9-vpxzg\" (UID: \"deb0f70e-c451-439a-afb4-c26f327638d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vpxzg" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.180716 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0eba8762-dd47-4910-bae6-79802ac64ba8-signing-cabundle\") pod \"service-ca-9c57cc56f-fd8lt\" (UID: \"0eba8762-dd47-4910-bae6-79802ac64ba8\") " pod="openshift-service-ca/service-ca-9c57cc56f-fd8lt" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.182022 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/deb0f70e-c451-439a-afb4-c26f327638d9-trusted-ca\") pod \"ingress-operator-5b745b69d9-vpxzg\" (UID: \"deb0f70e-c451-439a-afb4-c26f327638d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vpxzg" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.182791 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.183103 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6l6vs" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.183422 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.183839 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4952a751-3601-4381-9b92-d5d720b6dca2-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.183989 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4952a751-3601-4381-9b92-d5d720b6dca2-registry-certificates\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.184034 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2bvx\" (UniqueName: \"kubernetes.io/projected/cd9fea55-0ca7-4676-b5b7-53231b66d601-kube-api-access-r2bvx\") pod \"dns-default-glp9q\" (UID: \"cd9fea55-0ca7-4676-b5b7-53231b66d601\") " pod="openshift-dns/dns-default-glp9q" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.185304 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4952a751-3601-4381-9b92-d5d720b6dca2-registry-certificates\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.186162 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/4952a751-3601-4381-9b92-d5d720b6dca2-registry-tls\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.186213 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdkc9\" (UniqueName: \"kubernetes.io/projected/0eba8762-dd47-4910-bae6-79802ac64ba8-kube-api-access-kdkc9\") pod \"service-ca-9c57cc56f-fd8lt\" (UID: \"0eba8762-dd47-4910-bae6-79802ac64ba8\") " pod="openshift-service-ca/service-ca-9c57cc56f-fd8lt" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.186804 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.190674 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4952a751-3601-4381-9b92-d5d720b6dca2-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.194268 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/00de88df-032d-4a2a-aa57-db8080b919bf-proxy-tls\") pod \"machine-config-controller-84d6567774-thzh7\" (UID: \"00de88df-032d-4a2a-aa57-db8080b919bf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-thzh7" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.210004 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/deb0f70e-c451-439a-afb4-c26f327638d9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vpxzg\" (UID: 
\"deb0f70e-c451-439a-afb4-c26f327638d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vpxzg" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.234026 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jbgzc" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.234842 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c4nk\" (UniqueName: \"kubernetes.io/projected/00de88df-032d-4a2a-aa57-db8080b919bf-kube-api-access-9c4nk\") pod \"machine-config-controller-84d6567774-thzh7\" (UID: \"00de88df-032d-4a2a-aa57-db8080b919bf\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-thzh7" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.237860 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-r5h7f" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.252007 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4952a751-3601-4381-9b92-d5d720b6dca2-bound-sa-token\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.257016 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5mx8n" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.258772 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-jj9mw" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.266457 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swqkx\" (UniqueName: \"kubernetes.io/projected/deb0f70e-c451-439a-afb4-c26f327638d9-kube-api-access-swqkx\") pod \"ingress-operator-5b745b69d9-vpxzg\" (UID: \"deb0f70e-c451-439a-afb4-c26f327638d9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vpxzg" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.283139 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-thzh7" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.287984 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.288248 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zmhm\" (UniqueName: \"kubernetes.io/projected/b951c7e3-cc27-4f6f-8759-485c62c6d131-kube-api-access-6zmhm\") pod \"machine-config-server-59266\" (UID: \"b951c7e3-cc27-4f6f-8759-485c62c6d131\") " pod="openshift-machine-config-operator/machine-config-server-59266" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.288271 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b951c7e3-cc27-4f6f-8759-485c62c6d131-certs\") pod \"machine-config-server-59266\" (UID: \"b951c7e3-cc27-4f6f-8759-485c62c6d131\") " pod="openshift-machine-config-operator/machine-config-server-59266" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 
09:06:08.288307 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66-registration-dir\") pod \"csi-hostpathplugin-zq6xk\" (UID: \"e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66\") " pod="hostpath-provisioner/csi-hostpathplugin-zq6xk" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.288336 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b951c7e3-cc27-4f6f-8759-485c62c6d131-node-bootstrap-token\") pod \"machine-config-server-59266\" (UID: \"b951c7e3-cc27-4f6f-8759-485c62c6d131\") " pod="openshift-machine-config-operator/machine-config-server-59266" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.288352 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66-plugins-dir\") pod \"csi-hostpathplugin-zq6xk\" (UID: \"e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66\") " pod="hostpath-provisioner/csi-hostpathplugin-zq6xk" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.288401 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66-socket-dir\") pod \"csi-hostpathplugin-zq6xk\" (UID: \"e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66\") " pod="hostpath-provisioner/csi-hostpathplugin-zq6xk" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.288416 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/daf55391-e211-42e2-9159-29a347fa220e-cert\") pod \"ingress-canary-wsk9h\" (UID: \"daf55391-e211-42e2-9159-29a347fa220e\") " pod="openshift-ingress-canary/ingress-canary-wsk9h" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.288437 4563 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66-mountpoint-dir\") pod \"csi-hostpathplugin-zq6xk\" (UID: \"e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66\") " pod="hostpath-provisioner/csi-hostpathplugin-zq6xk" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.288460 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpzfn\" (UniqueName: \"kubernetes.io/projected/daf55391-e211-42e2-9159-29a347fa220e-kube-api-access-lpzfn\") pod \"ingress-canary-wsk9h\" (UID: \"daf55391-e211-42e2-9159-29a347fa220e\") " pod="openshift-ingress-canary/ingress-canary-wsk9h" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.288478 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vthx6\" (UniqueName: \"kubernetes.io/projected/e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66-kube-api-access-vthx6\") pod \"csi-hostpathplugin-zq6xk\" (UID: \"e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66\") " pod="hostpath-provisioner/csi-hostpathplugin-zq6xk" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.288492 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd9fea55-0ca7-4676-b5b7-53231b66d601-config-volume\") pod \"dns-default-glp9q\" (UID: \"cd9fea55-0ca7-4676-b5b7-53231b66d601\") " pod="openshift-dns/dns-default-glp9q" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.288523 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66-csi-data-dir\") pod \"csi-hostpathplugin-zq6xk\" (UID: \"e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66\") " pod="hostpath-provisioner/csi-hostpathplugin-zq6xk" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.288547 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cd9fea55-0ca7-4676-b5b7-53231b66d601-metrics-tls\") pod \"dns-default-glp9q\" (UID: \"cd9fea55-0ca7-4676-b5b7-53231b66d601\") " pod="openshift-dns/dns-default-glp9q" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.288594 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2bvx\" (UniqueName: \"kubernetes.io/projected/cd9fea55-0ca7-4676-b5b7-53231b66d601-kube-api-access-r2bvx\") pod \"dns-default-glp9q\" (UID: \"cd9fea55-0ca7-4676-b5b7-53231b66d601\") " pod="openshift-dns/dns-default-glp9q" Nov 24 09:06:08 crc kubenswrapper[4563]: E1124 09:06:08.288816 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:06:08.788797131 +0000 UTC m=+146.047774578 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.290536 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66-plugins-dir\") pod \"csi-hostpathplugin-zq6xk\" (UID: \"e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66\") " pod="hostpath-provisioner/csi-hostpathplugin-zq6xk" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.290596 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66-registration-dir\") pod \"csi-hostpathplugin-zq6xk\" (UID: \"e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66\") " pod="hostpath-provisioner/csi-hostpathplugin-zq6xk" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.293332 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd9fea55-0ca7-4676-b5b7-53231b66d601-config-volume\") pod \"dns-default-glp9q\" (UID: \"cd9fea55-0ca7-4676-b5b7-53231b66d601\") " pod="openshift-dns/dns-default-glp9q" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.293715 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66-csi-data-dir\") pod \"csi-hostpathplugin-zq6xk\" (UID: \"e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66\") " pod="hostpath-provisioner/csi-hostpathplugin-zq6xk" Nov 24 09:06:08 crc kubenswrapper[4563]: 
I1124 09:06:08.293861 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66-socket-dir\") pod \"csi-hostpathplugin-zq6xk\" (UID: \"e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66\") " pod="hostpath-provisioner/csi-hostpathplugin-zq6xk" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.294546 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66-mountpoint-dir\") pod \"csi-hostpathplugin-zq6xk\" (UID: \"e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66\") " pod="hostpath-provisioner/csi-hostpathplugin-zq6xk" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.302442 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b951c7e3-cc27-4f6f-8759-485c62c6d131-certs\") pod \"machine-config-server-59266\" (UID: \"b951c7e3-cc27-4f6f-8759-485c62c6d131\") " pod="openshift-machine-config-operator/machine-config-server-59266" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.304330 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/daf55391-e211-42e2-9159-29a347fa220e-cert\") pod \"ingress-canary-wsk9h\" (UID: \"daf55391-e211-42e2-9159-29a347fa220e\") " pod="openshift-ingress-canary/ingress-canary-wsk9h" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.314224 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdkc9\" (UniqueName: \"kubernetes.io/projected/0eba8762-dd47-4910-bae6-79802ac64ba8-kube-api-access-kdkc9\") pod \"service-ca-9c57cc56f-fd8lt\" (UID: \"0eba8762-dd47-4910-bae6-79802ac64ba8\") " pod="openshift-service-ca/service-ca-9c57cc56f-fd8lt" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.318389 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b951c7e3-cc27-4f6f-8759-485c62c6d131-node-bootstrap-token\") pod \"machine-config-server-59266\" (UID: \"b951c7e3-cc27-4f6f-8759-485c62c6d131\") " pod="openshift-machine-config-operator/machine-config-server-59266" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.322014 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cd9fea55-0ca7-4676-b5b7-53231b66d601-metrics-tls\") pod \"dns-default-glp9q\" (UID: \"cd9fea55-0ca7-4676-b5b7-53231b66d601\") " pod="openshift-dns/dns-default-glp9q" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.332822 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2bvx\" (UniqueName: \"kubernetes.io/projected/cd9fea55-0ca7-4676-b5b7-53231b66d601-kube-api-access-r2bvx\") pod \"dns-default-glp9q\" (UID: \"cd9fea55-0ca7-4676-b5b7-53231b66d601\") " pod="openshift-dns/dns-default-glp9q" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.363446 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zmhm\" (UniqueName: \"kubernetes.io/projected/b951c7e3-cc27-4f6f-8759-485c62c6d131-kube-api-access-6zmhm\") pod \"machine-config-server-59266\" (UID: \"b951c7e3-cc27-4f6f-8759-485c62c6d131\") " pod="openshift-machine-config-operator/machine-config-server-59266" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.374191 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpzfn\" (UniqueName: \"kubernetes.io/projected/daf55391-e211-42e2-9159-29a347fa220e-kube-api-access-lpzfn\") pod \"ingress-canary-wsk9h\" (UID: \"daf55391-e211-42e2-9159-29a347fa220e\") " pod="openshift-ingress-canary/ingress-canary-wsk9h" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.390251 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:08 crc kubenswrapper[4563]: E1124 09:06:08.391299 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:08.891282366 +0000 UTC m=+146.150259814 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.393166 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vthx6\" (UniqueName: \"kubernetes.io/projected/e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66-kube-api-access-vthx6\") pod \"csi-hostpathplugin-zq6xk\" (UID: \"e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66\") " pod="hostpath-provisioner/csi-hostpathplugin-zq6xk" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.492034 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:06:08 crc kubenswrapper[4563]: E1124 09:06:08.493331 4563 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:06:08.993315259 +0000 UTC m=+146.252292705 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.518433 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-fd8lt" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.523697 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vpxzg" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.591174 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wsk9h" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.601384 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:08 crc kubenswrapper[4563]: E1124 09:06:08.601729 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:09.101715889 +0000 UTC m=+146.360693336 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.602411 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-glp9q" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.603679 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-59266" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.617116 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-zq6xk" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.679606 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-g9q5v" event={"ID":"99e63a17-8605-4830-96b4-dd619cf76549","Type":"ContainerStarted","Data":"65d992faf76f9dded47291f2d921cf054d303707053bb4b4aa54580a19aebaac"} Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.684785 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-g9q5v" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.684805 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-g9q5v" event={"ID":"99e63a17-8605-4830-96b4-dd619cf76549","Type":"ContainerStarted","Data":"642ca4d153567074abf47780a4f5a2029c2adaabb3592b271b1b3b0c3e9e6225"} Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.689811 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zgnq6" event={"ID":"898668ee-3043-4bdc-8e77-82c108bcc65d","Type":"ContainerStarted","Data":"5fee1a52ece49056b870bb97ec910ad95b1c595554f25c268dcd9042a4b62538"} Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.716985 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:06:08 crc kubenswrapper[4563]: E1124 09:06:08.717334 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2025-11-24 09:06:09.2173111 +0000 UTC m=+146.476288537 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.730844 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q4vst" event={"ID":"a1768f84-8c65-48c1-beb8-1309d1d2e823","Type":"ContainerStarted","Data":"a4fb86952d5b63f301e5e27db1bb3a2310bf2484777a3fe653a38374f0aaff53"} Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.741956 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ppl5f" event={"ID":"48fdef5e-c65c-4898-af52-6ea141ab67b7","Type":"ContainerStarted","Data":"9cc2c3e5d2a51eb0d48af045dd065a6c5a1369a880609ab3dab5c1ff5425be92"} Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.744385 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-g9q5v" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.750184 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fdqpk" event={"ID":"728c9cea-9302-4856-95cc-2ea71352ec94","Type":"ContainerStarted","Data":"c665f4899ef6bb4c7a416f6f38d0de8feed47e6fa85927d4bfd86401d0307e34"} Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.750218 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fdqpk" event={"ID":"728c9cea-9302-4856-95cc-2ea71352ec94","Type":"ContainerStarted","Data":"5490b43aa655beb737f567f315954fdc2dd993eacd302e8299da888749e95c73"} Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.758929 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-r5h7f" event={"ID":"fed196e5-1e64-4d16-b63f-297eac90a06d","Type":"ContainerStarted","Data":"9d7939bea8abe3facd81976cb8c3933594997944134ba67734409e25be800db1"} Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.767171 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zmp5c" event={"ID":"4e13a5b1-f9f7-4045-952a-a44cfd536a99","Type":"ContainerStarted","Data":"2888f7358774a3fccbfb2cfda7ab8cd22bbb1457aa038676a408e6e31aaa14e3"} Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.768093 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zmp5c" Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.768111 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zmp5c" event={"ID":"4e13a5b1-f9f7-4045-952a-a44cfd536a99","Type":"ContainerStarted","Data":"1d83dd74979d0554ff7d7a41928c87f4cec5e72ae69b7286d0e7c94af6236873"} Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.769059 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" event={"ID":"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51","Type":"ContainerStarted","Data":"6503607e7cbb8af0f5d636e7dec64e604742849901486ad7b4a9fbe609b715f4"} Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.769150 4563 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zmp5c container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get 
\"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body=
Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.769176 4563 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zmp5c" podUID="4e13a5b1-f9f7-4045-952a-a44cfd536a99" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused"
Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.775542 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jv526" event={"ID":"2eb44334-ab46-4eca-a39f-7b289792b178","Type":"ContainerStarted","Data":"0f810a55701ea99a73fe67013b0baa6c8fdd799819c2c3678288ba64f31f08cb"}
Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.775594 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jv526" event={"ID":"2eb44334-ab46-4eca-a39f-7b289792b178","Type":"ContainerStarted","Data":"374503a5f076b7bd03a49c0569eb659f874100b296cb86dbeb3f721e4f9ed70a"}
Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.775835 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jv526"
Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.797903 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kj6k4" event={"ID":"7aeea6e8-f475-47dc-8b80-fade6640c678","Type":"ContainerStarted","Data":"099c71cf2b831c32c22624aa01999c7f32fe7e41b4b95561e5b1e36078845b69"}
Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.801649 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lpv9k" event={"ID":"8c8ee9e7-2ff1-4be6-bd44-60193b7aed66","Type":"ContainerStarted","Data":"0b8264511af6ea04187fe0f14a748ca78fe02c7e6ce826cfee3f6f384c5b5e37"}
Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.820500 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8"
Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.820826 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399580-sccd5" event={"ID":"6f15c383-9eb5-4942-a63d-48e54beea23d","Type":"ContainerStarted","Data":"6877ba89bd141ca57d4b33eee569ea5163b25c9d082bd9bb4644097c9de2b7f2"}
Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.820865 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399580-sccd5" event={"ID":"6f15c383-9eb5-4942-a63d-48e54beea23d","Type":"ContainerStarted","Data":"6773ada90c039f9e8576a8f046592ec013cec3d230d5a91b2f21e8d5b7e60f79"}
Nov 24 09:06:08 crc kubenswrapper[4563]: E1124 09:06:08.821875 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:09.321862541 +0000 UTC m=+146.580839989 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.824030 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xjp4v" event={"ID":"6ca055d5-b576-4aa7-bcb2-138156414ff0","Type":"ContainerStarted","Data":"54997cb1a6cff3cae692059cfa11cb593784d38e005d88788ef8eacef770824e"}
Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.838416 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54kkk"]
Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.842237 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-7hx7w"]
Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.842848 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bxdlb" event={"ID":"48157749-8872-4c5b-b119-efe27cfd887e","Type":"ContainerStarted","Data":"dfe74750dfc42bf9f595bc9d60e2e0a3538b190238e18715ec7dd950235c1be2"}
Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.843459 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bxdlb" event={"ID":"48157749-8872-4c5b-b119-efe27cfd887e","Type":"ContainerStarted","Data":"4e1f1b9c6e1cc757558ce59b46b83666b0d60b02abababadb19dbaf74230377e"}
Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.843477 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bxdlb" event={"ID":"48157749-8872-4c5b-b119-efe27cfd887e","Type":"ContainerStarted","Data":"5ac3217b957c1b4e70795abece472365182a04a4e61e67db3db77e9e63e56ed2"}
Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.850809 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd"
Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.851676 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-9bp54"
Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.928088 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jv526" podStartSLOduration=123.928074234 podStartE2EDuration="2m3.928074234s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:08.926457225 +0000 UTC m=+146.185434672" watchObservedRunningTime="2025-11-24 09:06:08.928074234 +0000 UTC m=+146.187051681"
Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.928224 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 24 09:06:08 crc kubenswrapper[4563]: E1124 09:06:08.929559 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:06:09.429544196 +0000 UTC m=+146.688521643 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.993057 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 24 09:06:08 crc kubenswrapper[4563]: I1124 09:06:08.993281 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.040492 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h"
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.055804 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8"
Nov 24 09:06:09 crc kubenswrapper[4563]: E1124 09:06:09.059205 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:09.556234341 +0000 UTC m=+146.815211789 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.161955 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 24 09:06:09 crc kubenswrapper[4563]: E1124 09:06:09.162249 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:06:09.662235989 +0000 UTC m=+146.921213435 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.240238 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jv526"
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.240496 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vsmrn"]
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.240511 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-tzmwx"]
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.240521 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-z4lb4"]
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.240531 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj"]
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.266063 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8"
Nov 24 09:06:09 crc kubenswrapper[4563]: E1124 09:06:09.266353 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:09.76634236 +0000 UTC m=+147.025319807 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.333166 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29399580-sccd5" podStartSLOduration=124.333149637 podStartE2EDuration="2m4.333149637s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:09.327123953 +0000 UTC m=+146.586101401" watchObservedRunningTime="2025-11-24 09:06:09.333149637 +0000 UTC m=+146.592127084"
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.378231 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 24 09:06:09 crc kubenswrapper[4563]: E1124 09:06:09.378657 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:06:09.87862856 +0000 UTC m=+147.137606007 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 09:06:09 crc kubenswrapper[4563]: W1124 09:06:09.455277 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20670dac_a915_49d2_8953_cb842980ca87.slice/crio-04cc9d8b3ff32bf49eba6cbbade53577926abf598f3a6fd5c0d1d5854cd2f50d WatchSource:0}: Error finding container 04cc9d8b3ff32bf49eba6cbbade53577926abf598f3a6fd5c0d1d5854cd2f50d: Status 404 returned error can't find the container with id 04cc9d8b3ff32bf49eba6cbbade53577926abf598f3a6fd5c0d1d5854cd2f50d
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.479741 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8"
Nov 24 09:06:09 crc kubenswrapper[4563]: E1124 09:06:09.480361 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:09.980349043 +0000 UTC m=+147.239326490 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.582161 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 24 09:06:09 crc kubenswrapper[4563]: E1124 09:06:09.582889 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:06:10.082867881 +0000 UTC m=+147.341845328 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.602624 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-bxdlb" podStartSLOduration=124.602605396 podStartE2EDuration="2m4.602605396s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:09.563572881 +0000 UTC m=+146.822550328" watchObservedRunningTime="2025-11-24 09:06:09.602605396 +0000 UTC m=+146.861582843"
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.603767 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" podStartSLOduration=124.603755645 podStartE2EDuration="2m4.603755645s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:09.59998868 +0000 UTC m=+146.858966127" watchObservedRunningTime="2025-11-24 09:06:09.603755645 +0000 UTC m=+146.862733091"
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.693457 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8"
Nov 24 09:06:09 crc kubenswrapper[4563]: E1124 09:06:09.716480 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:10.216460604 +0000 UTC m=+147.475438050 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.737963 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd" podStartSLOduration=124.737618598 podStartE2EDuration="2m4.737618598s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:09.73465028 +0000 UTC m=+146.993627727" watchObservedRunningTime="2025-11-24 09:06:09.737618598 +0000 UTC m=+146.996596045"
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.778020 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-9bp54" podStartSLOduration=124.778004805 podStartE2EDuration="2m4.778004805s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:09.776493315 +0000 UTC m=+147.035470762" watchObservedRunningTime="2025-11-24 09:06:09.778004805 +0000 UTC m=+147.036982252"
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.796883 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 24 09:06:09 crc kubenswrapper[4563]: E1124 09:06:09.796997 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:06:10.296977217 +0000 UTC m=+147.555954663 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.797089 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8"
Nov 24 09:06:09 crc kubenswrapper[4563]: E1124 09:06:09.798976 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:10.298963743 +0000 UTC m=+147.557941190 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.817519 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zmp5c" podStartSLOduration=124.817498869 podStartE2EDuration="2m4.817498869s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:09.816617067 +0000 UTC m=+147.075594514" watchObservedRunningTime="2025-11-24 09:06:09.817498869 +0000 UTC m=+147.076476316"
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.832415 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6l6vs"]
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.872020 4563 generic.go:334] "Generic (PLEG): container finished" podID="a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51" containerID="ff9d9562193d83cb06d20cb0cc570c9fe403d59e1c3c2d04a9d6985c83630787" exitCode=0
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.872099 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" event={"ID":"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51","Type":"ContainerDied","Data":"ff9d9562193d83cb06d20cb0cc570c9fe403d59e1c3c2d04a9d6985c83630787"}
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.887551 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lpv9k" podStartSLOduration=124.887536811 podStartE2EDuration="2m4.887536811s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:09.886819979 +0000 UTC m=+147.145797425" watchObservedRunningTime="2025-11-24 09:06:09.887536811 +0000 UTC m=+147.146514257"
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.897565 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 24 09:06:09 crc kubenswrapper[4563]: E1124 09:06:09.897851 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:06:10.397837237 +0000 UTC m=+147.656814683 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.898418 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ppl5f" event={"ID":"48fdef5e-c65c-4898-af52-6ea141ab67b7","Type":"ContainerStarted","Data":"6d3e2bff7b490917d58cac9e7f89146d4f31e89dd1bd48970fff948104d720f6"}
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.911529 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nsrtq"]
Nov 24 09:06:09 crc kubenswrapper[4563]: W1124 09:06:09.931211 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-214893e40e0899be1a0f95f83349d82e17764934a04c87b3ed317215e09a38a0 WatchSource:0}: Error finding container 214893e40e0899be1a0f95f83349d82e17764934a04c87b3ed317215e09a38a0: Status 404 returned error can't find the container with id 214893e40e0899be1a0f95f83349d82e17764934a04c87b3ed317215e09a38a0
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.937257 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9b6t7"]
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.938704 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-jmv44"]
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.948713 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-rshgn"]
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.949496 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-92skl"]
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.956976 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-g9q5v" podStartSLOduration=124.956960611 podStartE2EDuration="2m4.956960611s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:09.945314307 +0000 UTC m=+147.204291754" watchObservedRunningTime="2025-11-24 09:06:09.956960611 +0000 UTC m=+147.215938059"
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.960590 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xjp4v" event={"ID":"6ca055d5-b576-4aa7-bcb2-138156414ff0","Type":"ContainerStarted","Data":"5b9ab24a7ca5d3c96a82bdff0b0229854518bdf85e401cc0f28a60aaff8fa241"}
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.960665 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xjp4v" event={"ID":"6ca055d5-b576-4aa7-bcb2-138156414ff0","Type":"ContainerStarted","Data":"2c210cdaf139b894ec3a860ab08b2252f24a5fd2fde1db1a7f56a9ecb498794e"}
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.961853 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xjp4v"
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.962576 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wwmlv"]
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.994745 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kj6k4" event={"ID":"7aeea6e8-f475-47dc-8b80-fade6640c678","Type":"ContainerStarted","Data":"e2dc2120c1b7bbbb3258520c9b9e961b532a4d518b3df007a36eea3ab5818f64"}
Nov 24 09:06:09 crc kubenswrapper[4563]: I1124 09:06:09.994785 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kj6k4" event={"ID":"7aeea6e8-f475-47dc-8b80-fade6640c678","Type":"ContainerStarted","Data":"ede797065b65b8c4a958f4f292cb0b198e7e3630cd2d771f8dd847432540bff8"}
Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.000826 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8"
Nov 24 09:06:10 crc kubenswrapper[4563]: E1124 09:06:10.001953 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:10.501940763 +0000 UTC m=+147.760918210 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.013521 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zgnq6" event={"ID":"898668ee-3043-4bdc-8e77-82c108bcc65d","Type":"ContainerStarted","Data":"8a038829b89af7f8629752c9902a3fc4318d4fe99342816b3061f175684044d0"}
Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.058275 4563 generic.go:334] "Generic (PLEG): container finished" podID="6f15c383-9eb5-4942-a63d-48e54beea23d" containerID="6877ba89bd141ca57d4b33eee569ea5163b25c9d082bd9bb4644097c9de2b7f2" exitCode=0
Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.058791 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-jj9mw"]
Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.058819 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399580-sccd5" event={"ID":"6f15c383-9eb5-4942-a63d-48e54beea23d","Type":"ContainerDied","Data":"6877ba89bd141ca57d4b33eee569ea5163b25c9d082bd9bb4644097c9de2b7f2"}
Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.072760 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-fd8lt"]
Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.078044 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jqkcz"]
Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.086252 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fdqpk" event={"ID":"728c9cea-9302-4856-95cc-2ea71352ec94","Type":"ContainerStarted","Data":"78df1a6eab5557da25a170f106245b22b787b62cbdb9c5b3e6d68b6391d3f8c1"}
Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.099108 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-tzmwx" event={"ID":"b5a091e0-6549-4f21-a0dc-5f7452dc9c0f","Type":"ContainerStarted","Data":"1d63c2e8560b5997be2d4e725f93dbadc785046026a758297f3149978a493dab"}
Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.099143 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-tzmwx" event={"ID":"b5a091e0-6549-4f21-a0dc-5f7452dc9c0f","Type":"ContainerStarted","Data":"eeaa7cf791aee64dacaf5af6426bbea56b894223c78ffa9e4d3a1d0a2def09bc"}
Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.099813 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-tzmwx"
Nov 24 09:06:10 crc kubenswrapper[4563]: W1124 09:06:10.101530 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-cbd6eb4c2e5cf3d411141e672b991255a318d9d0e546ecd4e6cddce58154deb7 WatchSource:0}: Error finding container cbd6eb4c2e5cf3d411141e672b991255a318d9d0e546ecd4e6cddce58154deb7: Status 404 returned error can't find the container with id cbd6eb4c2e5cf3d411141e672b991255a318d9d0e546ecd4e6cddce58154deb7
Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.101544 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 24 09:06:10 crc kubenswrapper[4563]: E1124 09:06:10.101623 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:06:10.601609917 +0000 UTC m=+147.860587364 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.101938 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8"
Nov 24 09:06:10 crc kubenswrapper[4563]: E1124 09:06:10.105274 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:10.605261113 +0000 UTC m=+147.864238560 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.105625 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-r5h7f" event={"ID":"fed196e5-1e64-4d16-b63f-297eac90a06d","Type":"ContainerStarted","Data":"dc20200d9a4d22bdfa9d32eba63e6ac4dd781ce089dac41cf7efd19ea31b8705"}
Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.112744 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-glp9q"]
Nov 24 09:06:10 crc kubenswrapper[4563]: W1124 09:06:10.113817 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78e9b53f_4e52_4b94_8685_d5e84fb0b9cd.slice/crio-5f7993f2d540fd8108f5b4c02b07706b6cafaa10e63c910de89cbc3f5f8f0b3d WatchSource:0}: Error finding container 5f7993f2d540fd8108f5b4c02b07706b6cafaa10e63c910de89cbc3f5f8f0b3d: Status 404 returned error can't find the container with id 5f7993f2d540fd8108f5b4c02b07706b6cafaa10e63c910de89cbc3f5f8f0b3d
Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.117810 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-z4lb4" event={"ID":"378c7d30-dd7c-4aa5-83cf-7caca587f283","Type":"ContainerStarted","Data":"de20923b90561f0a5acb3bb3092592c0297bcaf5a4deec0f734386da9572e5bd"}
Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.117843 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-z4lb4" event={"ID":"378c7d30-dd7c-4aa5-83cf-7caca587f283","Type":"ContainerStarted","Data":"781fcb61f4328ed1892827847a7e6da513ebe7219311fbdcaf403934e663310b"} Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.125383 4563 patch_prober.go:28] interesting pod/downloads-7954f5f757-tzmwx container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.125415 4563 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tzmwx" podUID="b5a091e0-6549-4f21-a0dc-5f7452dc9c0f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.136345 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vpxzg"] Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.145222 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7hx7w" event={"ID":"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28","Type":"ContainerStarted","Data":"c8250f0446ef98d65787e3a58b19b297a82346cd3ed13878e276419ecf3ba3da"} Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.145270 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7hx7w" event={"ID":"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28","Type":"ContainerStarted","Data":"92799b1e34eeeee51c6e4aaaa6b0b28eae0a13ca48fff82f24d4c1dd82f733f7"} Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.202703 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-zq6xk"] Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.202816 4563 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kj6k4" podStartSLOduration=125.202800631 podStartE2EDuration="2m5.202800631s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:10.197705811 +0000 UTC m=+147.456683459" watchObservedRunningTime="2025-11-24 09:06:10.202800631 +0000 UTC m=+147.461778077" Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.203326 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:06:10 crc kubenswrapper[4563]: E1124 09:06:10.203678 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:06:10.703650964 +0000 UTC m=+147.962628411 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.203810 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:10 crc kubenswrapper[4563]: E1124 09:06:10.204427 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:10.70441274 +0000 UTC m=+147.963390187 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.212346 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54kkk" event={"ID":"74b64da0-d6d7-44d6-9be9-d1120a019e02","Type":"ContainerStarted","Data":"33c19c2ea7bbd00b09e0eb6ae7c5c4ff60b23b6c69736a26a70c2806a25d82e7"} Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.212400 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54kkk" event={"ID":"74b64da0-d6d7-44d6-9be9-d1120a019e02","Type":"ContainerStarted","Data":"def9cc09c06a0b67330f6640092573cb803c8f364f4466937774e0cd1c03b5d1"} Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.237123 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q4vst" event={"ID":"a1768f84-8c65-48c1-beb8-1309d1d2e823","Type":"ContainerStarted","Data":"cd766cd80de8235cd9ae9c90007e9208b97cbd7e0e5192522c6e98dad14fdf14"} Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.237170 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q4vst" event={"ID":"a1768f84-8c65-48c1-beb8-1309d1d2e823","Type":"ContainerStarted","Data":"d54bdab6415c8b23cc6d2dc159a22c89a6be84845a672e27c5a6776a40d0d52e"} Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.240035 4563 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-r5h7f" Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.250837 4563 patch_prober.go:28] interesting pod/router-default-5444994796-r5h7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 24 09:06:10 crc kubenswrapper[4563]: [-]has-synced failed: reason withheld Nov 24 09:06:10 crc kubenswrapper[4563]: [+]process-running ok Nov 24 09:06:10 crc kubenswrapper[4563]: healthz check failed Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.250877 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5h7f" podUID="fed196e5-1e64-4d16-b63f-297eac90a06d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.271444 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vsmrn" event={"ID":"dcb29296-6b35-478f-9712-ef96d33867c2","Type":"ContainerStarted","Data":"a8c237c394018d591143a24f31cae834748f0c8184b4843ed80e524739067f6f"} Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.271476 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vsmrn" event={"ID":"dcb29296-6b35-478f-9712-ef96d33867c2","Type":"ContainerStarted","Data":"3e7833f03c49ad23ec352361922ab67ee4ae92cbecc332c18f2075ed9584e18b"} Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.298183 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5mx8n"] Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.309063 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" 
event={"ID":"20670dac-a915-49d2-8953-cb842980ca87","Type":"ContainerStarted","Data":"04cc9d8b3ff32bf49eba6cbbade53577926abf598f3a6fd5c0d1d5854cd2f50d"} Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.311545 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:06:10 crc kubenswrapper[4563]: E1124 09:06:10.311832 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:06:10.811811874 +0000 UTC m=+148.070789321 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.330591 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xjp4v" podStartSLOduration=125.330568628 podStartE2EDuration="2m5.330568628s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:10.330280736 +0000 UTC m=+147.589258182" watchObservedRunningTime="2025-11-24 09:06:10.330568628 +0000 UTC 
m=+147.589546075" Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.334119 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-thzh7"] Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.340730 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jbgzc"] Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.364119 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wsk9h"] Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.375504 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-59266" event={"ID":"b951c7e3-cc27-4f6f-8759-485c62c6d131","Type":"ContainerStarted","Data":"58695e5bcbfe80761f84de560c5c8026c46f9473134a762a85d3bb57bf11bbd7"} Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.375547 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-59266" event={"ID":"b951c7e3-cc27-4f6f-8759-485c62c6d131","Type":"ContainerStarted","Data":"08a8091eb542b2bf726c1db6606a7f18a1ccf1771b240ecf25fbd899cd66ca62"} Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.378047 4563 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zmp5c container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.378082 4563 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zmp5c" podUID="4e13a5b1-f9f7-4045-952a-a44cfd536a99" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: 
connection refused" Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.415623 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.417131 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zgnq6" podStartSLOduration=125.417120603 podStartE2EDuration="2m5.417120603s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:10.416305276 +0000 UTC m=+147.675282723" watchObservedRunningTime="2025-11-24 09:06:10.417120603 +0000 UTC m=+147.676098050" Nov 24 09:06:10 crc kubenswrapper[4563]: E1124 09:06:10.420339 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:10.920328523 +0000 UTC m=+148.179305970 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.456837 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-ppl5f" podStartSLOduration=125.456820467 podStartE2EDuration="2m5.456820467s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:10.456190488 +0000 UTC m=+147.715167936" watchObservedRunningTime="2025-11-24 09:06:10.456820467 +0000 UTC m=+147.715797913" Nov 24 09:06:10 crc kubenswrapper[4563]: W1124 09:06:10.466908 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28e525e8_966c_4b8a_b9bb_064bfb18b592.slice/crio-b496769e5a67b099357144baa53c6f883889e1a65141a205231c3f2b96b8e2ec WatchSource:0}: Error finding container b496769e5a67b099357144baa53c6f883889e1a65141a205231c3f2b96b8e2ec: Status 404 returned error can't find the container with id b496769e5a67b099357144baa53c6f883889e1a65141a205231c3f2b96b8e2ec Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.519564 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-54kkk" podStartSLOduration=125.519544283 podStartE2EDuration="2m5.519544283s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:10.489919741 +0000 UTC m=+147.748897189" watchObservedRunningTime="2025-11-24 09:06:10.519544283 +0000 UTC m=+147.778521730" Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.521414 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:06:10 crc kubenswrapper[4563]: E1124 09:06:10.522869 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:06:11.022849867 +0000 UTC m=+148.281827314 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.559323 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-r5h7f" podStartSLOduration=125.559303829 podStartE2EDuration="2m5.559303829s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:10.520797165 +0000 UTC m=+147.779774612" watchObservedRunningTime="2025-11-24 09:06:10.559303829 +0000 UTC m=+147.818281275" Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.587425 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-q4vst" podStartSLOduration=125.587406038 podStartE2EDuration="2m5.587406038s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:10.563465286 +0000 UTC m=+147.822442734" watchObservedRunningTime="2025-11-24 09:06:10.587406038 +0000 UTC m=+147.846383485" Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.587539 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-59266" podStartSLOduration=5.58753456 podStartE2EDuration="5.58753456s" podCreationTimestamp="2025-11-24 09:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:10.587176797 +0000 UTC m=+147.846154243" watchObservedRunningTime="2025-11-24 09:06:10.58753456 +0000 UTC m=+147.846512007" Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.608562 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-7hx7w" podStartSLOduration=125.608545978 podStartE2EDuration="2m5.608545978s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:10.607981142 +0000 UTC m=+147.866958589" watchObservedRunningTime="2025-11-24 09:06:10.608545978 +0000 UTC m=+147.867523424" Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.623267 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:10 crc kubenswrapper[4563]: E1124 09:06:10.623619 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:11.123608255 +0000 UTC m=+148.382585702 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.684153 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-z4lb4" podStartSLOduration=125.6841368 podStartE2EDuration="2m5.6841368s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:10.683251802 +0000 UTC m=+147.942229249" watchObservedRunningTime="2025-11-24 09:06:10.6841368 +0000 UTC m=+147.943114247" Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.722386 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-tzmwx" podStartSLOduration=125.722367944 podStartE2EDuration="2m5.722367944s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:10.7208866 +0000 UTC m=+147.979864047" watchObservedRunningTime="2025-11-24 09:06:10.722367944 +0000 UTC m=+147.981345391" Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.725156 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:06:10 crc kubenswrapper[4563]: E1124 09:06:10.725258 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:06:11.225240451 +0000 UTC m=+148.484217898 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.725409 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:10 crc kubenswrapper[4563]: E1124 09:06:10.725729 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:11.225717831 +0000 UTC m=+148.484695278 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.762155 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vsmrn" podStartSLOduration=125.762139802 podStartE2EDuration="2m5.762139802s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:10.760107469 +0000 UTC m=+148.019084917" watchObservedRunningTime="2025-11-24 09:06:10.762139802 +0000 UTC m=+148.021117249" Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.809960 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fdqpk" podStartSLOduration=125.809937427 podStartE2EDuration="2m5.809937427s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:10.806207873 +0000 UTC m=+148.065185320" watchObservedRunningTime="2025-11-24 09:06:10.809937427 +0000 UTC m=+148.068914875" Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.827067 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:06:10 crc kubenswrapper[4563]: E1124 09:06:10.827318 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:06:11.327297759 +0000 UTC m=+148.586275206 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.827420 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:10 crc kubenswrapper[4563]: E1124 09:06:10.827896 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:11.327890537 +0000 UTC m=+148.586867984 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.929165 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:06:10 crc kubenswrapper[4563]: E1124 09:06:10.929528 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:06:11.429507684 +0000 UTC m=+148.688485132 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:10 crc kubenswrapper[4563]: I1124 09:06:10.929605 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:10 crc kubenswrapper[4563]: E1124 09:06:10.929881 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:11.429873435 +0000 UTC m=+148.688850881 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.030606 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:06:11 crc kubenswrapper[4563]: E1124 09:06:11.030962 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:06:11.53094883 +0000 UTC m=+148.789926277 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.131983 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:11 crc kubenswrapper[4563]: E1124 09:06:11.138769 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:11.63874946 +0000 UTC m=+148.897726907 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.234177 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:06:11 crc kubenswrapper[4563]: E1124 09:06:11.234560 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:06:11.734546411 +0000 UTC m=+148.993523858 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.243594 4563 patch_prober.go:28] interesting pod/router-default-5444994796-r5h7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 24 09:06:11 crc kubenswrapper[4563]: [-]has-synced failed: reason withheld Nov 24 09:06:11 crc kubenswrapper[4563]: [+]process-running ok Nov 24 09:06:11 crc kubenswrapper[4563]: healthz check failed Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.243671 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5h7f" podUID="fed196e5-1e64-4d16-b63f-297eac90a06d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.335307 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:11 crc kubenswrapper[4563]: E1124 09:06:11.335788 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-24 09:06:11.835774495 +0000 UTC m=+149.094751942 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.408037 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-jj9mw" event={"ID":"e11ba0a3-483c-4306-a89f-61f79a52b10d","Type":"ContainerStarted","Data":"2ec13b67db3d1ee3ab39db0d6990a39ab7ffe12ca281c3be7e1be0e95518d248"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.408083 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-jj9mw" event={"ID":"e11ba0a3-483c-4306-a89f-61f79a52b10d","Type":"ContainerStarted","Data":"3fc3450d5ac7ffe0014a8590ff3c7494590489be23f6e31a4f6ec7772218770b"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.414992 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"86015cffb3b150510a2bd93eb767d19cb5d65ed6e8747a12708f289bb680d0c5"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.415031 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"cbd6eb4c2e5cf3d411141e672b991255a318d9d0e546ecd4e6cddce58154deb7"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.417831 
4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-fd8lt" event={"ID":"0eba8762-dd47-4910-bae6-79802ac64ba8","Type":"ContainerStarted","Data":"98660500f3295f8eb85aa6e8c009e0312de1335ee1b34c77b2f09af2f95292bf"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.417893 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-fd8lt" event={"ID":"0eba8762-dd47-4910-bae6-79802ac64ba8","Type":"ContainerStarted","Data":"280eed5ebd23faf15ea9c3f1a82a3c7a2b6a4e46e304b068b2ceb72663bee67f"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.433907 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vsmrn" event={"ID":"dcb29296-6b35-478f-9712-ef96d33867c2","Type":"ContainerStarted","Data":"dfbb648875d71bc00493c07e820441c40b45b476f79074750558ed9eecf070ae"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.437269 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:06:11 crc kubenswrapper[4563]: E1124 09:06:11.437581 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:06:11.937568336 +0000 UTC m=+149.196545782 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.441885 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wwmlv" event={"ID":"07ac5545-11d9-470e-a4e0-1073333eebdc","Type":"ContainerStarted","Data":"465c1e05143a929c7b105de6cda193f1ff90e255e72f74cd25722be960ee1f51"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.441919 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wwmlv" event={"ID":"07ac5545-11d9-470e-a4e0-1073333eebdc","Type":"ContainerStarted","Data":"ef4f6d1f5c72d324b1f4ee3f364376e27aaf498baafb4351a6d5efa6ec6a2ce9"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.446946 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-zq6xk" event={"ID":"e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66","Type":"ContainerStarted","Data":"aaa15d489bc5a5f61620f3cf57e7d121d6592547cb77fe95e6230193789c8c60"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.452866 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"dd87b1e1ecee0963c1520f7a4bc5c49ccff8e40bdc3f34cb98d2040a067d0b05"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.453168 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"214893e40e0899be1a0f95f83349d82e17764934a04c87b3ed317215e09a38a0"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.453529 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.456273 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-fd8lt" podStartSLOduration=126.456261 podStartE2EDuration="2m6.456261s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:11.451536189 +0000 UTC m=+148.710513636" watchObservedRunningTime="2025-11-24 09:06:11.456261 +0000 UTC m=+148.715238447" Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.481431 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5mx8n" event={"ID":"8d287fb4-5d89-41ef-953d-92afcb5f33d3","Type":"ContainerStarted","Data":"1613dac08a0c3b80f73c138d69f56938cb8222f1fa432bc2c42b553991ac79b1"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.481476 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5mx8n" event={"ID":"8d287fb4-5d89-41ef-953d-92afcb5f33d3","Type":"ContainerStarted","Data":"8fed70113062e43e4f4559674bbf4af7a88648c863f384ab2f2c7f9a65bbe118"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.503057 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5mx8n" Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.503891 4563 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wwmlv" podStartSLOduration=126.503877234 podStartE2EDuration="2m6.503877234s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:11.480671147 +0000 UTC m=+148.739648594" watchObservedRunningTime="2025-11-24 09:06:11.503877234 +0000 UTC m=+148.762854680" Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.510096 4563 generic.go:334] "Generic (PLEG): container finished" podID="9350d335-c1d0-4315-9da0-75a6bd635efc" containerID="75fcd651b4e230327966b328251d5472121198b950ecfe7761e20d3fcc0062f6" exitCode=0 Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.510225 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmv44" event={"ID":"9350d335-c1d0-4315-9da0-75a6bd635efc","Type":"ContainerDied","Data":"75fcd651b4e230327966b328251d5472121198b950ecfe7761e20d3fcc0062f6"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.510256 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmv44" event={"ID":"9350d335-c1d0-4315-9da0-75a6bd635efc","Type":"ContainerStarted","Data":"0f20d022bf77fabf0fc953507c67f42330f2ab91a1a40877fc34cd0286d8685b"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.513343 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2b0be30bafdc8b0c4ceba4afefcc290425c18c9f53b1a64d6b01efd782b93cb4"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.513560 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"236b8cac49c217bd233f8de3fff08de7083456eb49aa430dd2d97b1087b36c40"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.519601 4563 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-5mx8n container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.519720 4563 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5mx8n" podUID="8d287fb4-5d89-41ef-953d-92afcb5f33d3" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.522405 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" event={"ID":"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51","Type":"ContainerStarted","Data":"eba5830b829546e967c0ddbc0a9a59a59b42a81df7c46d0a39f948d3c814dc55"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.533394 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-nsrtq" event={"ID":"3f63d03a-f0c4-486c-9e6e-eb29c84d228b","Type":"ContainerStarted","Data":"c92528a37606de09811362ea329b98b4542446a8877634ed66639b8c31398acb"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.533435 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-nsrtq" event={"ID":"3f63d03a-f0c4-486c-9e6e-eb29c84d228b","Type":"ContainerStarted","Data":"699f134139293d4f32486eee0612ba4da1aa02dc6ac06cd37336ff282008b9d9"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.540161 4563 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:11 crc kubenswrapper[4563]: E1124 09:06:11.540483 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:12.04046662 +0000 UTC m=+149.299444067 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.547870 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" podStartSLOduration=126.547854313 podStartE2EDuration="2m6.547854313s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:11.545232268 +0000 UTC m=+148.804209714" watchObservedRunningTime="2025-11-24 09:06:11.547854313 +0000 UTC m=+148.806831760" Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.559786 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-rshgn" 
event={"ID":"25f557a3-e5cd-4355-9e52-b7542c0103d2","Type":"ContainerStarted","Data":"ade35c90d4dde0a51b97317f88bb346349b146ba76ce59986d669937b2e3a323"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.559827 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-rshgn" event={"ID":"25f557a3-e5cd-4355-9e52-b7542c0103d2","Type":"ContainerStarted","Data":"03f442f570abe32b1885b2d001e01980e1f76e84166315412253a55159cf6fea"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.573314 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jbgzc" event={"ID":"28e525e8-966c-4b8a-b9bb-064bfb18b592","Type":"ContainerStarted","Data":"47b22b697f5947451b56f7e44503acca8ed0d758a7412b32fe7653d4fd6cd2ca"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.573444 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jbgzc" event={"ID":"28e525e8-966c-4b8a-b9bb-064bfb18b592","Type":"ContainerStarted","Data":"b496769e5a67b099357144baa53c6f883889e1a65141a205231c3f2b96b8e2ec"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.574353 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jbgzc" Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.592740 4563 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-jbgzc container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.592781 4563 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jbgzc" podUID="28e525e8-966c-4b8a-b9bb-064bfb18b592" 
containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.599952 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5mx8n" podStartSLOduration=126.599930488 podStartE2EDuration="2m6.599930488s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:11.575083677 +0000 UTC m=+148.834061114" watchObservedRunningTime="2025-11-24 09:06:11.599930488 +0000 UTC m=+148.858907935" Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.603009 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-thzh7" event={"ID":"00de88df-032d-4a2a-aa57-db8080b919bf","Type":"ContainerStarted","Data":"a379361f561d82658d5a6f0674e5972a2c40611c70fabb03836cbd39dfea2238"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.603161 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-thzh7" event={"ID":"00de88df-032d-4a2a-aa57-db8080b919bf","Type":"ContainerStarted","Data":"8802f22434eab2c139e5c3aaa40e496f94e6330ebaf845ec6a251c35219d612b"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.618715 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6l6vs" event={"ID":"3d9a97b3-2f1d-4159-b128-a8e34f58f55b","Type":"ContainerStarted","Data":"52dea5fc1b6652bf84d32699359ddb3adfc8c82f7c89dd0b41704dfb3aba1bd5"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.618965 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6l6vs" event={"ID":"3d9a97b3-2f1d-4159-b128-a8e34f58f55b","Type":"ContainerStarted","Data":"f8aee98ad2f53fed23ea4c176489c5bc61a844ba7c011283b612f7cd2323ad9f"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.635994 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-92skl" event={"ID":"78e9b53f-4e52-4b94-8685-d5e84fb0b9cd","Type":"ContainerStarted","Data":"5b3c4356f29afd5b3d9ec8b0587c2ad0edebe8c3ef4cf47441ff3d6ae2b5b05e"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.636031 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-92skl" event={"ID":"78e9b53f-4e52-4b94-8685-d5e84fb0b9cd","Type":"ContainerStarted","Data":"5f7993f2d540fd8108f5b4c02b07706b6cafaa10e63c910de89cbc3f5f8f0b3d"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.642841 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:06:11 crc kubenswrapper[4563]: E1124 09:06:11.643797 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:06:12.143775438 +0000 UTC m=+149.402752884 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.657345 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wsk9h" event={"ID":"daf55391-e211-42e2-9159-29a347fa220e","Type":"ContainerStarted","Data":"bddd73f84a94fc63ccedf4f621436a1ad775384c3bff745afe7d4c302e347ebe"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.657378 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wsk9h" event={"ID":"daf55391-e211-42e2-9159-29a347fa220e","Type":"ContainerStarted","Data":"5e5882e575a7caa66b0c9a76c352f0e3bd482b9bd9607bff5694f79aa77d3a91"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.662408 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vpxzg" event={"ID":"deb0f70e-c451-439a-afb4-c26f327638d9","Type":"ContainerStarted","Data":"70bf2213c13e0ecff6bd5dc1c39023b7c946360f290cd0fd0961df031eb7c8fc"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.662452 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vpxzg" event={"ID":"deb0f70e-c451-439a-afb4-c26f327638d9","Type":"ContainerStarted","Data":"30f50d86a24fb45f6d0e0b4fe80527c2e9c296336c633da04e50618e52f15a2b"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.671095 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jbgzc" 
podStartSLOduration=126.671063353 podStartE2EDuration="2m6.671063353s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:11.656828146 +0000 UTC m=+148.915805593" watchObservedRunningTime="2025-11-24 09:06:11.671063353 +0000 UTC m=+148.930040800" Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.678066 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9b6t7" event={"ID":"b028b276-09cb-4d47-af70-1790128259df","Type":"ContainerStarted","Data":"19a5bd1b5503a9e34596abc624536b1b367396338010121c39c10e98108d11c6"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.678113 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9b6t7" event={"ID":"b028b276-09cb-4d47-af70-1790128259df","Type":"ContainerStarted","Data":"b442d1ce10e383f29f67a2c5f0ac20410a11382933f350546f900e1f2e878637"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.679987 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6l6vs" podStartSLOduration=126.679965061 podStartE2EDuration="2m6.679965061s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:11.67866019 +0000 UTC m=+148.937637636" watchObservedRunningTime="2025-11-24 09:06:11.679965061 +0000 UTC m=+148.938942509" Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.682959 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-glp9q" 
event={"ID":"cd9fea55-0ca7-4676-b5b7-53231b66d601","Type":"ContainerStarted","Data":"ab22418e92724547aaede91aae364e5f468fc388d1db55f1a4143ee89669270d"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.682991 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-glp9q" event={"ID":"cd9fea55-0ca7-4676-b5b7-53231b66d601","Type":"ContainerStarted","Data":"3c71762bc45098ba570a2adbf6f86087f2e92ed08ec5ffae221101db09e08ea4"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.696627 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-thzh7" podStartSLOduration=126.696608681 podStartE2EDuration="2m6.696608681s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:11.694502858 +0000 UTC m=+148.953480306" watchObservedRunningTime="2025-11-24 09:06:11.696608681 +0000 UTC m=+148.955586127" Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.697564 4563 generic.go:334] "Generic (PLEG): container finished" podID="20670dac-a915-49d2-8953-cb842980ca87" containerID="91026758f70e5e0924ffbdea8bb844a38ab8b88b6aba0f5d54f15c7c5de6eb33" exitCode=0 Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.697655 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" event={"ID":"20670dac-a915-49d2-8953-cb842980ca87","Type":"ContainerStarted","Data":"c68c8d1b7669f82704f73def247c49d953f1c9ffb8a393a0242a8c1a7b18ab73"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.697683 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" event={"ID":"20670dac-a915-49d2-8953-cb842980ca87","Type":"ContainerDied","Data":"91026758f70e5e0924ffbdea8bb844a38ab8b88b6aba0f5d54f15c7c5de6eb33"} Nov 24 09:06:11 
crc kubenswrapper[4563]: I1124 09:06:11.709789 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jqkcz" event={"ID":"e584f76f-d222-42b7-bfad-8190793ade5c","Type":"ContainerStarted","Data":"561a550d0a302559304b91e4f5efaefab70076c8ba9ca46bfa77d711a92a14ac"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.709822 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jqkcz" event={"ID":"e584f76f-d222-42b7-bfad-8190793ade5c","Type":"ContainerStarted","Data":"dfaed7161e47563fae32b41480f5b53d4ae3c98cad3900b5f206953654cedb53"} Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.712753 4563 patch_prober.go:28] interesting pod/downloads-7954f5f757-tzmwx container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.712789 4563 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-tzmwx" podUID="b5a091e0-6549-4f21-a0dc-5f7452dc9c0f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.715401 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-nsrtq" podStartSLOduration=126.715370895 podStartE2EDuration="2m6.715370895s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:11.713332421 +0000 UTC m=+148.972309867" watchObservedRunningTime="2025-11-24 09:06:11.715370895 +0000 UTC m=+148.974348342" Nov 24 09:06:11 crc 
kubenswrapper[4563]: I1124 09:06:11.740271 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-92skl" podStartSLOduration=126.740257812 podStartE2EDuration="2m6.740257812s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:11.737915534 +0000 UTC m=+148.996892981" watchObservedRunningTime="2025-11-24 09:06:11.740257812 +0000 UTC m=+148.999235259" Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.744229 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:11 crc kubenswrapper[4563]: E1124 09:06:11.748209 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:12.248199088 +0000 UTC m=+149.507176536 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.762191 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" podStartSLOduration=126.762182321 podStartE2EDuration="2m6.762182321s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:11.761377153 +0000 UTC m=+149.020354599" watchObservedRunningTime="2025-11-24 09:06:11.762182321 +0000 UTC m=+149.021159769" Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.820848 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9b6t7" podStartSLOduration=126.82083034 podStartE2EDuration="2m6.82083034s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:11.820775025 +0000 UTC m=+149.079752473" watchObservedRunningTime="2025-11-24 09:06:11.82083034 +0000 UTC m=+149.079807787" Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.821090 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wsk9h" podStartSLOduration=6.821085931 podStartE2EDuration="6.821085931s" podCreationTimestamp="2025-11-24 09:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:11.794262173 +0000 UTC m=+149.053239619" watchObservedRunningTime="2025-11-24 09:06:11.821085931 +0000 UTC m=+149.080063379" Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.845028 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:06:11 crc kubenswrapper[4563]: E1124 09:06:11.846682 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:06:12.346665574 +0000 UTC m=+149.605643022 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.873152 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vpxzg" podStartSLOduration=126.87313701 podStartE2EDuration="2m6.87313701s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:11.844271118 +0000 UTC m=+149.103248565" watchObservedRunningTime="2025-11-24 09:06:11.87313701 +0000 UTC m=+149.132114456" Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.873551 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-jqkcz" podStartSLOduration=126.873547664 podStartE2EDuration="2m6.873547664s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:11.871124283 +0000 UTC m=+149.130101730" watchObservedRunningTime="2025-11-24 09:06:11.873547664 +0000 UTC m=+149.132525110" Nov 24 09:06:11 crc kubenswrapper[4563]: I1124 09:06:11.949473 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: 
\"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:11 crc kubenswrapper[4563]: E1124 09:06:11.949793 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:12.44978212 +0000 UTC m=+149.708759567 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.050519 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:06:12 crc kubenswrapper[4563]: E1124 09:06:12.050668 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:06:12.550623725 +0000 UTC m=+149.809601173 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.050948 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:12 crc kubenswrapper[4563]: E1124 09:06:12.051462 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:12.551452629 +0000 UTC m=+149.810430075 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.125174 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399580-sccd5" Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.152278 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:06:12 crc kubenswrapper[4563]: E1124 09:06:12.152437 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:06:12.652413649 +0000 UTC m=+149.911391096 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.152632 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:12 crc kubenswrapper[4563]: E1124 09:06:12.152934 4563 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:12.652924894 +0000 UTC m=+149.911902341 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.241154 4563 patch_prober.go:28] interesting pod/router-default-5444994796-r5h7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 24 09:06:12 crc kubenswrapper[4563]: [-]has-synced failed: reason withheld Nov 24 09:06:12 crc kubenswrapper[4563]: [+]process-running ok Nov 24 09:06:12 crc kubenswrapper[4563]: healthz check failed Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.241206 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5h7f" podUID="fed196e5-1e64-4d16-b63f-297eac90a06d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.253846 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.253902 4563 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f15c383-9eb5-4942-a63d-48e54beea23d-config-volume\") pod \"6f15c383-9eb5-4942-a63d-48e54beea23d\" (UID: \"6f15c383-9eb5-4942-a63d-48e54beea23d\") " Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.253932 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f15c383-9eb5-4942-a63d-48e54beea23d-secret-volume\") pod \"6f15c383-9eb5-4942-a63d-48e54beea23d\" (UID: \"6f15c383-9eb5-4942-a63d-48e54beea23d\") " Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.253999 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh8l6\" (UniqueName: \"kubernetes.io/projected/6f15c383-9eb5-4942-a63d-48e54beea23d-kube-api-access-rh8l6\") pod \"6f15c383-9eb5-4942-a63d-48e54beea23d\" (UID: \"6f15c383-9eb5-4942-a63d-48e54beea23d\") " Nov 24 09:06:12 crc kubenswrapper[4563]: E1124 09:06:12.254703 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:06:12.754680552 +0000 UTC m=+150.013657998 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.255075 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f15c383-9eb5-4942-a63d-48e54beea23d-config-volume" (OuterVolumeSpecName: "config-volume") pod "6f15c383-9eb5-4942-a63d-48e54beea23d" (UID: "6f15c383-9eb5-4942-a63d-48e54beea23d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.261019 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f15c383-9eb5-4942-a63d-48e54beea23d-kube-api-access-rh8l6" (OuterVolumeSpecName: "kube-api-access-rh8l6") pod "6f15c383-9eb5-4942-a63d-48e54beea23d" (UID: "6f15c383-9eb5-4942-a63d-48e54beea23d"). InnerVolumeSpecName "kube-api-access-rh8l6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.262839 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f15c383-9eb5-4942-a63d-48e54beea23d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6f15c383-9eb5-4942-a63d-48e54beea23d" (UID: "6f15c383-9eb5-4942-a63d-48e54beea23d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.355748 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.355853 4563 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f15c383-9eb5-4942-a63d-48e54beea23d-config-volume\") on node \"crc\" DevicePath \"\"" Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.355865 4563 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f15c383-9eb5-4942-a63d-48e54beea23d-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.355875 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh8l6\" (UniqueName: \"kubernetes.io/projected/6f15c383-9eb5-4942-a63d-48e54beea23d-kube-api-access-rh8l6\") on node \"crc\" DevicePath \"\"" Nov 24 09:06:12 crc kubenswrapper[4563]: E1124 09:06:12.356093 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:12.856081792 +0000 UTC m=+150.115059239 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.457362 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:06:12 crc kubenswrapper[4563]: E1124 09:06:12.457725 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:06:12.95769307 +0000 UTC m=+150.216670516 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.457879 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:12 crc kubenswrapper[4563]: E1124 09:06:12.458186 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:12.958174738 +0000 UTC m=+150.217152184 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.558740 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:06:12 crc kubenswrapper[4563]: E1124 09:06:12.559016 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:06:13.058998678 +0000 UTC m=+150.317976126 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.559097 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:12 crc kubenswrapper[4563]: E1124 09:06:12.559393 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:13.059381782 +0000 UTC m=+150.318359230 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.661028 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:06:12 crc kubenswrapper[4563]: E1124 09:06:12.661343 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:06:13.161330185 +0000 UTC m=+150.420307632 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.749009 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vpxzg" event={"ID":"deb0f70e-c451-439a-afb4-c26f327638d9","Type":"ContainerStarted","Data":"20028746df86f83ce683a18328a74bc65b1b615eee7cb4f093d53dd4333894bf"} Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.754342 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.754445 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.758890 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmv44" event={"ID":"9350d335-c1d0-4315-9da0-75a6bd635efc","Type":"ContainerStarted","Data":"01bd85672f8a29a9e0b697b6e0b14b7df06051ab9fc0f4b842745ec73a0273c3"} Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.759988 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmv44" Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.762076 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:12 crc kubenswrapper[4563]: E1124 09:06:12.762381 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:13.262369673 +0000 UTC m=+150.521347120 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.764104 4563 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.764277 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-glp9q" event={"ID":"cd9fea55-0ca7-4676-b5b7-53231b66d601","Type":"ContainerStarted","Data":"11e3be0e92149722f4ab166ba2ef80585ebd1ca179a0af3ef1999d0e02a536c0"} Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.765092 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-glp9q" Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.778092 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399580-sccd5" Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.778463 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399580-sccd5" event={"ID":"6f15c383-9eb5-4942-a63d-48e54beea23d","Type":"ContainerDied","Data":"6773ada90c039f9e8576a8f046592ec013cec3d230d5a91b2f21e8d5b7e60f79"} Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.778501 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6773ada90c039f9e8576a8f046592ec013cec3d230d5a91b2f21e8d5b7e60f79" Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.782107 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.782349 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.792438 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmv44" podStartSLOduration=127.792428673 podStartE2EDuration="2m7.792428673s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:12.792327211 +0000 UTC m=+150.051304659" watchObservedRunningTime="2025-11-24 09:06:12.792428673 +0000 UTC m=+150.051406120" Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.803159 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-zq6xk" event={"ID":"e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66","Type":"ContainerStarted","Data":"eadb29840094e3208ec3eee676b2d491f40a7590d5ad86c5b1e2d63887cf22fa"} Nov 24 09:06:12 crc 
kubenswrapper[4563]: I1124 09:06:12.803193 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-zq6xk" event={"ID":"e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66","Type":"ContainerStarted","Data":"d35f434a6a7690a5aaea001b8e1fcf108bc05fcecf0bd2654cfda6983bcdb724"} Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.803203 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-zq6xk" event={"ID":"e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66","Type":"ContainerStarted","Data":"ddade5a53db4d51bde624d46aa99d95df235b5cba0bc148cf0bd1a4f7916f036"} Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.814158 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.818133 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-jj9mw" event={"ID":"e11ba0a3-483c-4306-a89f-61f79a52b10d","Type":"ContainerStarted","Data":"bf5d50ee1cdff1c81561a22c02cd931a6c1fc4a7cc1e6b8562ba89825f535f62"} Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.832341 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-thzh7" event={"ID":"00de88df-032d-4a2a-aa57-db8080b919bf","Type":"ContainerStarted","Data":"b9045cc33feeec4b89e4f23d9e38055b539ab5a0c71fb76001133c4c07346ba5"} Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.832741 4563 patch_prober.go:28] interesting pod/apiserver-76f77b778f-l6b8r container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Nov 24 09:06:12 crc kubenswrapper[4563]: [+]log ok Nov 24 09:06:12 crc kubenswrapper[4563]: [+]etcd ok Nov 24 09:06:12 crc kubenswrapper[4563]: 
[+]poststarthook/start-apiserver-admission-initializer ok Nov 24 09:06:12 crc kubenswrapper[4563]: [+]poststarthook/generic-apiserver-start-informers ok Nov 24 09:06:12 crc kubenswrapper[4563]: [+]poststarthook/max-in-flight-filter ok Nov 24 09:06:12 crc kubenswrapper[4563]: [+]poststarthook/storage-object-count-tracker-hook ok Nov 24 09:06:12 crc kubenswrapper[4563]: [+]poststarthook/image.openshift.io-apiserver-caches ok Nov 24 09:06:12 crc kubenswrapper[4563]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Nov 24 09:06:12 crc kubenswrapper[4563]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Nov 24 09:06:12 crc kubenswrapper[4563]: [+]poststarthook/project.openshift.io-projectcache ok Nov 24 09:06:12 crc kubenswrapper[4563]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Nov 24 09:06:12 crc kubenswrapper[4563]: [+]poststarthook/openshift.io-startinformers ok Nov 24 09:06:12 crc kubenswrapper[4563]: [+]poststarthook/openshift.io-restmapperupdater ok Nov 24 09:06:12 crc kubenswrapper[4563]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Nov 24 09:06:12 crc kubenswrapper[4563]: livez check failed Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.832790 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" podUID="a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.840303 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" event={"ID":"a0ec4c32-d6bd-4f38-b9ef-c10fa21d3b51","Type":"ContainerStarted","Data":"bd9a6b1e4a5bbe6401948297f35c66dfce4395e6542b6a15411e17702f998a0e"} Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.845210 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns-operator/dns-operator-744455d44c-rshgn" event={"ID":"25f557a3-e5cd-4355-9e52-b7542c0103d2","Type":"ContainerStarted","Data":"fe61730996ffb63796778c6b23f9f2b3b469fdce38d5a97255b2dfc111e179d2"} Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.857668 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-glp9q" podStartSLOduration=7.85765042 podStartE2EDuration="7.85765042s" podCreationTimestamp="2025-11-24 09:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:12.81878477 +0000 UTC m=+150.077762207" watchObservedRunningTime="2025-11-24 09:06:12.85765042 +0000 UTC m=+150.116627857" Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.858203 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5mx8n" Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.859382 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jbgzc" Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.859607 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4zwrj" Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.862778 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:06:12 crc kubenswrapper[4563]: E1124 09:06:12.863780 4563 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:06:13.363766115 +0000 UTC m=+150.622743561 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.899502 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-jj9mw" podStartSLOduration=127.899486191 podStartE2EDuration="2m7.899486191s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:12.869864806 +0000 UTC m=+150.128842253" watchObservedRunningTime="2025-11-24 09:06:12.899486191 +0000 UTC m=+150.158463639" Nov 24 09:06:12 crc kubenswrapper[4563]: I1124 09:06:12.968429 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:12 crc kubenswrapper[4563]: E1124 09:06:12.979347 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:13.479331087 +0000 UTC m=+150.738308534 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.072931 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:06:13 crc kubenswrapper[4563]: E1124 09:06:13.073319 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-24 09:06:13.573306883 +0000 UTC m=+150.832284331 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.081474 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-rshgn" podStartSLOduration=128.08144898 podStartE2EDuration="2m8.08144898s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:12.967077256 +0000 UTC m=+150.226054704" watchObservedRunningTime="2025-11-24 09:06:13.08144898 +0000 UTC m=+150.340426426" Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.174244 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:13 crc kubenswrapper[4563]: E1124 09:06:13.174719 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-24 09:06:13.674698957 +0000 UTC m=+150.933676404 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mm4b8" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.204288 4563 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-11-24T09:06:12.764127458Z","Handler":null,"Name":""} Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.214120 4563 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.214161 4563 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.243117 4563 patch_prober.go:28] interesting pod/router-default-5444994796-r5h7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 24 09:06:13 crc kubenswrapper[4563]: [-]has-synced failed: reason withheld Nov 24 09:06:13 crc kubenswrapper[4563]: [+]process-running ok Nov 24 09:06:13 crc kubenswrapper[4563]: healthz check failed Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.243194 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5h7f" podUID="fed196e5-1e64-4d16-b63f-297eac90a06d" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.275333 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.282185 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.377720 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.382559 4563 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.382922 4563 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.425678 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mm4b8\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.552045 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.696853 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mm4b8"] Nov 24 09:06:13 crc kubenswrapper[4563]: W1124 09:06:13.705800 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4952a751_3601_4381_9b92_d5d720b6dca2.slice/crio-d71b3d4a3bc469b9cc6a65bf99e533c23e49dba8bf903e500afddfed5fb03347 WatchSource:0}: Error finding container d71b3d4a3bc469b9cc6a65bf99e533c23e49dba8bf903e500afddfed5fb03347: Status 404 returned error can't find the container with id d71b3d4a3bc469b9cc6a65bf99e533c23e49dba8bf903e500afddfed5fb03347 Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.832039 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fkz2h"] Nov 24 09:06:13 crc kubenswrapper[4563]: E1124 09:06:13.832296 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f15c383-9eb5-4942-a63d-48e54beea23d" containerName="collect-profiles" Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.832315 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f15c383-9eb5-4942-a63d-48e54beea23d" containerName="collect-profiles" Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.832435 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f15c383-9eb5-4942-a63d-48e54beea23d" containerName="collect-profiles" Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.833256 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fkz2h" Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.834790 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.843655 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fkz2h"] Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.850327 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" event={"ID":"4952a751-3601-4381-9b92-d5d720b6dca2","Type":"ContainerStarted","Data":"5e7395a329c40d1d263e6ebd6b5c7f6cb277a303f2475f4549d0a8005a04e9c5"} Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.850357 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" event={"ID":"4952a751-3601-4381-9b92-d5d720b6dca2","Type":"ContainerStarted","Data":"d71b3d4a3bc469b9cc6a65bf99e533c23e49dba8bf903e500afddfed5fb03347"} Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.850938 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.865379 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-zq6xk" event={"ID":"e39b1d60-c462-4a1e-b8bf-d0d90bfc5a66","Type":"ContainerStarted","Data":"3368319982ed246a30013f138ba453cfac5c0723acf4a06e7024640d6c56c9ed"} Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.879548 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" podStartSLOduration=128.879530784 podStartE2EDuration="2m8.879530784s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:13.877058261 +0000 UTC m=+151.136035708" watchObservedRunningTime="2025-11-24 09:06:13.879530784 +0000 UTC m=+151.138508231" Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.890907 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-zq6xk" podStartSLOduration=8.890884257 podStartE2EDuration="8.890884257s" podCreationTimestamp="2025-11-24 09:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:13.890312368 +0000 UTC m=+151.149289816" watchObservedRunningTime="2025-11-24 09:06:13.890884257 +0000 UTC m=+151.149861704" Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.985545 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8c919bf-e04d-4c09-84fa-064f434383bb-utilities\") pod \"certified-operators-fkz2h\" (UID: \"b8c919bf-e04d-4c09-84fa-064f434383bb\") " pod="openshift-marketplace/certified-operators-fkz2h" Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.985725 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pthfb\" (UniqueName: \"kubernetes.io/projected/b8c919bf-e04d-4c09-84fa-064f434383bb-kube-api-access-pthfb\") pod \"certified-operators-fkz2h\" (UID: \"b8c919bf-e04d-4c09-84fa-064f434383bb\") " pod="openshift-marketplace/certified-operators-fkz2h" Nov 24 09:06:13 crc kubenswrapper[4563]: I1124 09:06:13.986033 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8c919bf-e04d-4c09-84fa-064f434383bb-catalog-content\") pod \"certified-operators-fkz2h\" (UID: \"b8c919bf-e04d-4c09-84fa-064f434383bb\") " 
pod="openshift-marketplace/certified-operators-fkz2h" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.020356 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jhn4v"] Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.021346 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jhn4v" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.022875 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.057679 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.062239 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jhn4v"] Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.087099 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8c919bf-e04d-4c09-84fa-064f434383bb-catalog-content\") pod \"certified-operators-fkz2h\" (UID: \"b8c919bf-e04d-4c09-84fa-064f434383bb\") " pod="openshift-marketplace/certified-operators-fkz2h" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.087165 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8c919bf-e04d-4c09-84fa-064f434383bb-utilities\") pod \"certified-operators-fkz2h\" (UID: \"b8c919bf-e04d-4c09-84fa-064f434383bb\") " pod="openshift-marketplace/certified-operators-fkz2h" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.087213 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pthfb\" (UniqueName: 
\"kubernetes.io/projected/b8c919bf-e04d-4c09-84fa-064f434383bb-kube-api-access-pthfb\") pod \"certified-operators-fkz2h\" (UID: \"b8c919bf-e04d-4c09-84fa-064f434383bb\") " pod="openshift-marketplace/certified-operators-fkz2h" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.087730 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8c919bf-e04d-4c09-84fa-064f434383bb-utilities\") pod \"certified-operators-fkz2h\" (UID: \"b8c919bf-e04d-4c09-84fa-064f434383bb\") " pod="openshift-marketplace/certified-operators-fkz2h" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.087932 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8c919bf-e04d-4c09-84fa-064f434383bb-catalog-content\") pod \"certified-operators-fkz2h\" (UID: \"b8c919bf-e04d-4c09-84fa-064f434383bb\") " pod="openshift-marketplace/certified-operators-fkz2h" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.109250 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pthfb\" (UniqueName: \"kubernetes.io/projected/b8c919bf-e04d-4c09-84fa-064f434383bb-kube-api-access-pthfb\") pod \"certified-operators-fkz2h\" (UID: \"b8c919bf-e04d-4c09-84fa-064f434383bb\") " pod="openshift-marketplace/certified-operators-fkz2h" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.137846 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmv44" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.144512 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fkz2h" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.189079 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c852825f-7fac-45d3-b801-ae6eb253989a-utilities\") pod \"community-operators-jhn4v\" (UID: \"c852825f-7fac-45d3-b801-ae6eb253989a\") " pod="openshift-marketplace/community-operators-jhn4v" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.189231 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw8cf\" (UniqueName: \"kubernetes.io/projected/c852825f-7fac-45d3-b801-ae6eb253989a-kube-api-access-rw8cf\") pod \"community-operators-jhn4v\" (UID: \"c852825f-7fac-45d3-b801-ae6eb253989a\") " pod="openshift-marketplace/community-operators-jhn4v" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.189524 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c852825f-7fac-45d3-b801-ae6eb253989a-catalog-content\") pod \"community-operators-jhn4v\" (UID: \"c852825f-7fac-45d3-b801-ae6eb253989a\") " pod="openshift-marketplace/community-operators-jhn4v" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.220208 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tfx45"] Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.221177 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tfx45" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.243051 4563 patch_prober.go:28] interesting pod/router-default-5444994796-r5h7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 24 09:06:14 crc kubenswrapper[4563]: [-]has-synced failed: reason withheld Nov 24 09:06:14 crc kubenswrapper[4563]: [+]process-running ok Nov 24 09:06:14 crc kubenswrapper[4563]: healthz check failed Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.243131 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5h7f" podUID="fed196e5-1e64-4d16-b63f-297eac90a06d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.252684 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tfx45"] Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.291754 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c852825f-7fac-45d3-b801-ae6eb253989a-utilities\") pod \"community-operators-jhn4v\" (UID: \"c852825f-7fac-45d3-b801-ae6eb253989a\") " pod="openshift-marketplace/community-operators-jhn4v" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.291810 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw8cf\" (UniqueName: \"kubernetes.io/projected/c852825f-7fac-45d3-b801-ae6eb253989a-kube-api-access-rw8cf\") pod \"community-operators-jhn4v\" (UID: \"c852825f-7fac-45d3-b801-ae6eb253989a\") " pod="openshift-marketplace/community-operators-jhn4v" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.291867 4563 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c852825f-7fac-45d3-b801-ae6eb253989a-catalog-content\") pod \"community-operators-jhn4v\" (UID: \"c852825f-7fac-45d3-b801-ae6eb253989a\") " pod="openshift-marketplace/community-operators-jhn4v" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.292287 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c852825f-7fac-45d3-b801-ae6eb253989a-catalog-content\") pod \"community-operators-jhn4v\" (UID: \"c852825f-7fac-45d3-b801-ae6eb253989a\") " pod="openshift-marketplace/community-operators-jhn4v" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.292498 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c852825f-7fac-45d3-b801-ae6eb253989a-utilities\") pod \"community-operators-jhn4v\" (UID: \"c852825f-7fac-45d3-b801-ae6eb253989a\") " pod="openshift-marketplace/community-operators-jhn4v" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.309743 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.311009 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.311528 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw8cf\" (UniqueName: \"kubernetes.io/projected/c852825f-7fac-45d3-b801-ae6eb253989a-kube-api-access-rw8cf\") pod \"community-operators-jhn4v\" (UID: \"c852825f-7fac-45d3-b801-ae6eb253989a\") " pod="openshift-marketplace/community-operators-jhn4v" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.316989 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.317292 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.317773 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.332383 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jhn4v" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.348230 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fkz2h"] Nov 24 09:06:14 crc kubenswrapper[4563]: W1124 09:06:14.358049 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8c919bf_e04d_4c09_84fa_064f434383bb.slice/crio-aa0d08f4669337a7e1f3a6b8538bbe8bb26fc7fe1aa90d36bd878047f72ec099 WatchSource:0}: Error finding container aa0d08f4669337a7e1f3a6b8538bbe8bb26fc7fe1aa90d36bd878047f72ec099: Status 404 returned error can't find the container with id aa0d08f4669337a7e1f3a6b8538bbe8bb26fc7fe1aa90d36bd878047f72ec099 Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.393322 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b5917ce-6f12-449f-9637-a94eff40aea4-utilities\") pod \"certified-operators-tfx45\" (UID: \"0b5917ce-6f12-449f-9637-a94eff40aea4\") " pod="openshift-marketplace/certified-operators-tfx45" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.393386 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf5fr\" (UniqueName: \"kubernetes.io/projected/0b5917ce-6f12-449f-9637-a94eff40aea4-kube-api-access-gf5fr\") pod \"certified-operators-tfx45\" (UID: \"0b5917ce-6f12-449f-9637-a94eff40aea4\") " pod="openshift-marketplace/certified-operators-tfx45" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.393406 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b5917ce-6f12-449f-9637-a94eff40aea4-catalog-content\") pod \"certified-operators-tfx45\" (UID: \"0b5917ce-6f12-449f-9637-a94eff40aea4\") " 
pod="openshift-marketplace/certified-operators-tfx45" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.419655 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xt68x"] Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.420631 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xt68x" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.435368 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xt68x"] Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.494885 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf5fr\" (UniqueName: \"kubernetes.io/projected/0b5917ce-6f12-449f-9637-a94eff40aea4-kube-api-access-gf5fr\") pod \"certified-operators-tfx45\" (UID: \"0b5917ce-6f12-449f-9637-a94eff40aea4\") " pod="openshift-marketplace/certified-operators-tfx45" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.494932 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b5917ce-6f12-449f-9637-a94eff40aea4-catalog-content\") pod \"certified-operators-tfx45\" (UID: \"0b5917ce-6f12-449f-9637-a94eff40aea4\") " pod="openshift-marketplace/certified-operators-tfx45" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.494970 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/61e55d3d-eab4-4f40-967f-87942bfc05bb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"61e55d3d-eab4-4f40-967f-87942bfc05bb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.495028 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/61e55d3d-eab4-4f40-967f-87942bfc05bb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"61e55d3d-eab4-4f40-967f-87942bfc05bb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.495077 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b5917ce-6f12-449f-9637-a94eff40aea4-utilities\") pod \"certified-operators-tfx45\" (UID: \"0b5917ce-6f12-449f-9637-a94eff40aea4\") " pod="openshift-marketplace/certified-operators-tfx45" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.495814 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b5917ce-6f12-449f-9637-a94eff40aea4-catalog-content\") pod \"certified-operators-tfx45\" (UID: \"0b5917ce-6f12-449f-9637-a94eff40aea4\") " pod="openshift-marketplace/certified-operators-tfx45" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.495837 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b5917ce-6f12-449f-9637-a94eff40aea4-utilities\") pod \"certified-operators-tfx45\" (UID: \"0b5917ce-6f12-449f-9637-a94eff40aea4\") " pod="openshift-marketplace/certified-operators-tfx45" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.510626 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf5fr\" (UniqueName: \"kubernetes.io/projected/0b5917ce-6f12-449f-9637-a94eff40aea4-kube-api-access-gf5fr\") pod \"certified-operators-tfx45\" (UID: \"0b5917ce-6f12-449f-9637-a94eff40aea4\") " pod="openshift-marketplace/certified-operators-tfx45" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.511309 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jhn4v"] Nov 24 09:06:14 crc kubenswrapper[4563]: 
W1124 09:06:14.542459 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc852825f_7fac_45d3_b801_ae6eb253989a.slice/crio-45f128b334beed5a49273d65c5bcd6ed1801f80f16074811720e183272613d13 WatchSource:0}: Error finding container 45f128b334beed5a49273d65c5bcd6ed1801f80f16074811720e183272613d13: Status 404 returned error can't find the container with id 45f128b334beed5a49273d65c5bcd6ed1801f80f16074811720e183272613d13 Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.543840 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tfx45" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.596893 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/61e55d3d-eab4-4f40-967f-87942bfc05bb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"61e55d3d-eab4-4f40-967f-87942bfc05bb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.596980 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp5rh\" (UniqueName: \"kubernetes.io/projected/d318ef99-9cb4-4f69-81dc-183d64e7532c-kube-api-access-xp5rh\") pod \"community-operators-xt68x\" (UID: \"d318ef99-9cb4-4f69-81dc-183d64e7532c\") " pod="openshift-marketplace/community-operators-xt68x" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.597006 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d318ef99-9cb4-4f69-81dc-183d64e7532c-catalog-content\") pod \"community-operators-xt68x\" (UID: \"d318ef99-9cb4-4f69-81dc-183d64e7532c\") " pod="openshift-marketplace/community-operators-xt68x" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.597047 4563 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/61e55d3d-eab4-4f40-967f-87942bfc05bb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"61e55d3d-eab4-4f40-967f-87942bfc05bb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.597068 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d318ef99-9cb4-4f69-81dc-183d64e7532c-utilities\") pod \"community-operators-xt68x\" (UID: \"d318ef99-9cb4-4f69-81dc-183d64e7532c\") " pod="openshift-marketplace/community-operators-xt68x" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.597167 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/61e55d3d-eab4-4f40-967f-87942bfc05bb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"61e55d3d-eab4-4f40-967f-87942bfc05bb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.613438 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/61e55d3d-eab4-4f40-967f-87942bfc05bb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"61e55d3d-eab4-4f40-967f-87942bfc05bb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.623728 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.697976 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tfx45"] Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.698537 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d318ef99-9cb4-4f69-81dc-183d64e7532c-utilities\") pod \"community-operators-xt68x\" (UID: \"d318ef99-9cb4-4f69-81dc-183d64e7532c\") " pod="openshift-marketplace/community-operators-xt68x" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.698678 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp5rh\" (UniqueName: \"kubernetes.io/projected/d318ef99-9cb4-4f69-81dc-183d64e7532c-kube-api-access-xp5rh\") pod \"community-operators-xt68x\" (UID: \"d318ef99-9cb4-4f69-81dc-183d64e7532c\") " pod="openshift-marketplace/community-operators-xt68x" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.698980 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d318ef99-9cb4-4f69-81dc-183d64e7532c-catalog-content\") pod \"community-operators-xt68x\" (UID: \"d318ef99-9cb4-4f69-81dc-183d64e7532c\") " pod="openshift-marketplace/community-operators-xt68x" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.699020 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d318ef99-9cb4-4f69-81dc-183d64e7532c-utilities\") pod \"community-operators-xt68x\" (UID: \"d318ef99-9cb4-4f69-81dc-183d64e7532c\") " pod="openshift-marketplace/community-operators-xt68x" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.700252 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d318ef99-9cb4-4f69-81dc-183d64e7532c-catalog-content\") pod \"community-operators-xt68x\" (UID: \"d318ef99-9cb4-4f69-81dc-183d64e7532c\") " pod="openshift-marketplace/community-operators-xt68x" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.724046 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp5rh\" (UniqueName: \"kubernetes.io/projected/d318ef99-9cb4-4f69-81dc-183d64e7532c-kube-api-access-xp5rh\") pod \"community-operators-xt68x\" (UID: \"d318ef99-9cb4-4f69-81dc-183d64e7532c\") " pod="openshift-marketplace/community-operators-xt68x" Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.752175 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xt68x" Nov 24 09:06:14 crc kubenswrapper[4563]: W1124 09:06:14.757898 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b5917ce_6f12_449f_9637_a94eff40aea4.slice/crio-0056b4a985b44a58e4d8ad67018ec9759a5d7a6747437998da2e148c2a952c31 WatchSource:0}: Error finding container 0056b4a985b44a58e4d8ad67018ec9759a5d7a6747437998da2e148c2a952c31: Status 404 returned error can't find the container with id 0056b4a985b44a58e4d8ad67018ec9759a5d7a6747437998da2e148c2a952c31 Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.775071 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 24 09:06:14 crc kubenswrapper[4563]: W1124 09:06:14.782552 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod61e55d3d_eab4_4f40_967f_87942bfc05bb.slice/crio-14009cc5ebceaf47c49eb5534447ae51203cbc5039c7c63e6c24c9db2dd580e0 WatchSource:0}: Error finding container 14009cc5ebceaf47c49eb5534447ae51203cbc5039c7c63e6c24c9db2dd580e0: Status 404 returned error can't find the container with id 
14009cc5ebceaf47c49eb5534447ae51203cbc5039c7c63e6c24c9db2dd580e0 Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.876380 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfx45" event={"ID":"0b5917ce-6f12-449f-9637-a94eff40aea4","Type":"ContainerStarted","Data":"14928ac4a27e070cd1b7b18ad08dd055f9c1644d22d932deab3fbb3dda843b3a"} Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.876614 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfx45" event={"ID":"0b5917ce-6f12-449f-9637-a94eff40aea4","Type":"ContainerStarted","Data":"0056b4a985b44a58e4d8ad67018ec9759a5d7a6747437998da2e148c2a952c31"} Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.884755 4563 generic.go:334] "Generic (PLEG): container finished" podID="c852825f-7fac-45d3-b801-ae6eb253989a" containerID="feb154e7eca53ac0dcf14cdc0f2bd695742fbe6124fafe2de04a6c1679f1f212" exitCode=0 Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.884827 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jhn4v" event={"ID":"c852825f-7fac-45d3-b801-ae6eb253989a","Type":"ContainerDied","Data":"feb154e7eca53ac0dcf14cdc0f2bd695742fbe6124fafe2de04a6c1679f1f212"} Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.884848 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jhn4v" event={"ID":"c852825f-7fac-45d3-b801-ae6eb253989a","Type":"ContainerStarted","Data":"45f128b334beed5a49273d65c5bcd6ed1801f80f16074811720e183272613d13"} Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.886772 4563 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.887763 4563 generic.go:334] "Generic (PLEG): container finished" podID="b8c919bf-e04d-4c09-84fa-064f434383bb" 
containerID="1f4162972c8803809f39b9c4014d14149a06e623b12b89ee3fcc1c89357efbe7" exitCode=0 Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.887985 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fkz2h" event={"ID":"b8c919bf-e04d-4c09-84fa-064f434383bb","Type":"ContainerDied","Data":"1f4162972c8803809f39b9c4014d14149a06e623b12b89ee3fcc1c89357efbe7"} Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.888035 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fkz2h" event={"ID":"b8c919bf-e04d-4c09-84fa-064f434383bb","Type":"ContainerStarted","Data":"aa0d08f4669337a7e1f3a6b8538bbe8bb26fc7fe1aa90d36bd878047f72ec099"} Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.895605 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"61e55d3d-eab4-4f40-967f-87942bfc05bb","Type":"ContainerStarted","Data":"14009cc5ebceaf47c49eb5534447ae51203cbc5039c7c63e6c24c9db2dd580e0"} Nov 24 09:06:14 crc kubenswrapper[4563]: I1124 09:06:14.917921 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xt68x"] Nov 24 09:06:14 crc kubenswrapper[4563]: W1124 09:06:14.925133 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd318ef99_9cb4_4f69_81dc_183d64e7532c.slice/crio-285fbf988eab26feac7c73a15fab0dbd0a44f239dc91c77be11f433edaafda7f WatchSource:0}: Error finding container 285fbf988eab26feac7c73a15fab0dbd0a44f239dc91c77be11f433edaafda7f: Status 404 returned error can't find the container with id 285fbf988eab26feac7c73a15fab0dbd0a44f239dc91c77be11f433edaafda7f Nov 24 09:06:15 crc kubenswrapper[4563]: I1124 09:06:15.061404 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" 
path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Nov 24 09:06:15 crc kubenswrapper[4563]: I1124 09:06:15.241374 4563 patch_prober.go:28] interesting pod/router-default-5444994796-r5h7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 24 09:06:15 crc kubenswrapper[4563]: [-]has-synced failed: reason withheld Nov 24 09:06:15 crc kubenswrapper[4563]: [+]process-running ok Nov 24 09:06:15 crc kubenswrapper[4563]: healthz check failed Nov 24 09:06:15 crc kubenswrapper[4563]: I1124 09:06:15.241736 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5h7f" podUID="fed196e5-1e64-4d16-b63f-297eac90a06d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 24 09:06:15 crc kubenswrapper[4563]: I1124 09:06:15.821618 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wp6lh"] Nov 24 09:06:15 crc kubenswrapper[4563]: I1124 09:06:15.822758 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wp6lh" Nov 24 09:06:15 crc kubenswrapper[4563]: I1124 09:06:15.825686 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 24 09:06:15 crc kubenswrapper[4563]: I1124 09:06:15.826897 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wp6lh"] Nov 24 09:06:15 crc kubenswrapper[4563]: I1124 09:06:15.903575 4563 generic.go:334] "Generic (PLEG): container finished" podID="d318ef99-9cb4-4f69-81dc-183d64e7532c" containerID="4fc5e911f87a388fc9cbf7e6d876870d4c4a4faae0b248dee87f038e49271ad5" exitCode=0 Nov 24 09:06:15 crc kubenswrapper[4563]: I1124 09:06:15.903653 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xt68x" event={"ID":"d318ef99-9cb4-4f69-81dc-183d64e7532c","Type":"ContainerDied","Data":"4fc5e911f87a388fc9cbf7e6d876870d4c4a4faae0b248dee87f038e49271ad5"} Nov 24 09:06:15 crc kubenswrapper[4563]: I1124 09:06:15.903714 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xt68x" event={"ID":"d318ef99-9cb4-4f69-81dc-183d64e7532c","Type":"ContainerStarted","Data":"285fbf988eab26feac7c73a15fab0dbd0a44f239dc91c77be11f433edaafda7f"} Nov 24 09:06:15 crc kubenswrapper[4563]: I1124 09:06:15.908817 4563 generic.go:334] "Generic (PLEG): container finished" podID="0b5917ce-6f12-449f-9637-a94eff40aea4" containerID="14928ac4a27e070cd1b7b18ad08dd055f9c1644d22d932deab3fbb3dda843b3a" exitCode=0 Nov 24 09:06:15 crc kubenswrapper[4563]: I1124 09:06:15.908911 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfx45" event={"ID":"0b5917ce-6f12-449f-9637-a94eff40aea4","Type":"ContainerDied","Data":"14928ac4a27e070cd1b7b18ad08dd055f9c1644d22d932deab3fbb3dda843b3a"} Nov 24 09:06:15 crc kubenswrapper[4563]: I1124 09:06:15.911096 4563 
generic.go:334] "Generic (PLEG): container finished" podID="61e55d3d-eab4-4f40-967f-87942bfc05bb" containerID="0787f87802fce9d10766dfe79c30bbd0865b6c57441ac28a33b3f447b0264613" exitCode=0 Nov 24 09:06:15 crc kubenswrapper[4563]: I1124 09:06:15.911160 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"61e55d3d-eab4-4f40-967f-87942bfc05bb","Type":"ContainerDied","Data":"0787f87802fce9d10766dfe79c30bbd0865b6c57441ac28a33b3f447b0264613"} Nov 24 09:06:15 crc kubenswrapper[4563]: I1124 09:06:15.924144 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e-utilities\") pod \"redhat-marketplace-wp6lh\" (UID: \"41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e\") " pod="openshift-marketplace/redhat-marketplace-wp6lh" Nov 24 09:06:15 crc kubenswrapper[4563]: I1124 09:06:15.924242 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkmgq\" (UniqueName: \"kubernetes.io/projected/41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e-kube-api-access-qkmgq\") pod \"redhat-marketplace-wp6lh\" (UID: \"41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e\") " pod="openshift-marketplace/redhat-marketplace-wp6lh" Nov 24 09:06:15 crc kubenswrapper[4563]: I1124 09:06:15.924308 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e-catalog-content\") pod \"redhat-marketplace-wp6lh\" (UID: \"41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e\") " pod="openshift-marketplace/redhat-marketplace-wp6lh" Nov 24 09:06:16 crc kubenswrapper[4563]: I1124 09:06:16.027062 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkmgq\" (UniqueName: 
\"kubernetes.io/projected/41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e-kube-api-access-qkmgq\") pod \"redhat-marketplace-wp6lh\" (UID: \"41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e\") " pod="openshift-marketplace/redhat-marketplace-wp6lh" Nov 24 09:06:16 crc kubenswrapper[4563]: I1124 09:06:16.027832 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e-catalog-content\") pod \"redhat-marketplace-wp6lh\" (UID: \"41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e\") " pod="openshift-marketplace/redhat-marketplace-wp6lh" Nov 24 09:06:16 crc kubenswrapper[4563]: I1124 09:06:16.027950 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e-utilities\") pod \"redhat-marketplace-wp6lh\" (UID: \"41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e\") " pod="openshift-marketplace/redhat-marketplace-wp6lh" Nov 24 09:06:16 crc kubenswrapper[4563]: I1124 09:06:16.028156 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e-catalog-content\") pod \"redhat-marketplace-wp6lh\" (UID: \"41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e\") " pod="openshift-marketplace/redhat-marketplace-wp6lh" Nov 24 09:06:16 crc kubenswrapper[4563]: I1124 09:06:16.028310 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e-utilities\") pod \"redhat-marketplace-wp6lh\" (UID: \"41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e\") " pod="openshift-marketplace/redhat-marketplace-wp6lh" Nov 24 09:06:16 crc kubenswrapper[4563]: I1124 09:06:16.049705 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkmgq\" (UniqueName: 
\"kubernetes.io/projected/41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e-kube-api-access-qkmgq\") pod \"redhat-marketplace-wp6lh\" (UID: \"41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e\") " pod="openshift-marketplace/redhat-marketplace-wp6lh" Nov 24 09:06:16 crc kubenswrapper[4563]: I1124 09:06:16.142278 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wp6lh" Nov 24 09:06:16 crc kubenswrapper[4563]: I1124 09:06:16.223976 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kn9j4"] Nov 24 09:06:16 crc kubenswrapper[4563]: I1124 09:06:16.228936 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kn9j4" Nov 24 09:06:16 crc kubenswrapper[4563]: I1124 09:06:16.235089 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kn9j4"] Nov 24 09:06:16 crc kubenswrapper[4563]: I1124 09:06:16.241805 4563 patch_prober.go:28] interesting pod/router-default-5444994796-r5h7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 24 09:06:16 crc kubenswrapper[4563]: [-]has-synced failed: reason withheld Nov 24 09:06:16 crc kubenswrapper[4563]: [+]process-running ok Nov 24 09:06:16 crc kubenswrapper[4563]: healthz check failed Nov 24 09:06:16 crc kubenswrapper[4563]: I1124 09:06:16.241844 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5h7f" podUID="fed196e5-1e64-4d16-b63f-297eac90a06d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 24 09:06:16 crc kubenswrapper[4563]: I1124 09:06:16.333205 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wp6lh"] Nov 24 09:06:16 crc kubenswrapper[4563]: I1124 
09:06:16.333228 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2298fd99-7188-4326-a55f-9feb9b25d03d-utilities\") pod \"redhat-marketplace-kn9j4\" (UID: \"2298fd99-7188-4326-a55f-9feb9b25d03d\") " pod="openshift-marketplace/redhat-marketplace-kn9j4" Nov 24 09:06:16 crc kubenswrapper[4563]: I1124 09:06:16.333879 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2298fd99-7188-4326-a55f-9feb9b25d03d-catalog-content\") pod \"redhat-marketplace-kn9j4\" (UID: \"2298fd99-7188-4326-a55f-9feb9b25d03d\") " pod="openshift-marketplace/redhat-marketplace-kn9j4" Nov 24 09:06:16 crc kubenswrapper[4563]: I1124 09:06:16.336028 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hsrm\" (UniqueName: \"kubernetes.io/projected/2298fd99-7188-4326-a55f-9feb9b25d03d-kube-api-access-2hsrm\") pod \"redhat-marketplace-kn9j4\" (UID: \"2298fd99-7188-4326-a55f-9feb9b25d03d\") " pod="openshift-marketplace/redhat-marketplace-kn9j4" Nov 24 09:06:16 crc kubenswrapper[4563]: I1124 09:06:16.437867 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hsrm\" (UniqueName: \"kubernetes.io/projected/2298fd99-7188-4326-a55f-9feb9b25d03d-kube-api-access-2hsrm\") pod \"redhat-marketplace-kn9j4\" (UID: \"2298fd99-7188-4326-a55f-9feb9b25d03d\") " pod="openshift-marketplace/redhat-marketplace-kn9j4" Nov 24 09:06:16 crc kubenswrapper[4563]: I1124 09:06:16.438225 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2298fd99-7188-4326-a55f-9feb9b25d03d-utilities\") pod \"redhat-marketplace-kn9j4\" (UID: \"2298fd99-7188-4326-a55f-9feb9b25d03d\") " pod="openshift-marketplace/redhat-marketplace-kn9j4" Nov 24 
09:06:16 crc kubenswrapper[4563]: I1124 09:06:16.438313 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2298fd99-7188-4326-a55f-9feb9b25d03d-catalog-content\") pod \"redhat-marketplace-kn9j4\" (UID: \"2298fd99-7188-4326-a55f-9feb9b25d03d\") " pod="openshift-marketplace/redhat-marketplace-kn9j4" Nov 24 09:06:16 crc kubenswrapper[4563]: I1124 09:06:16.438701 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2298fd99-7188-4326-a55f-9feb9b25d03d-utilities\") pod \"redhat-marketplace-kn9j4\" (UID: \"2298fd99-7188-4326-a55f-9feb9b25d03d\") " pod="openshift-marketplace/redhat-marketplace-kn9j4" Nov 24 09:06:16 crc kubenswrapper[4563]: I1124 09:06:16.438825 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2298fd99-7188-4326-a55f-9feb9b25d03d-catalog-content\") pod \"redhat-marketplace-kn9j4\" (UID: \"2298fd99-7188-4326-a55f-9feb9b25d03d\") " pod="openshift-marketplace/redhat-marketplace-kn9j4" Nov 24 09:06:16 crc kubenswrapper[4563]: I1124 09:06:16.454630 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hsrm\" (UniqueName: \"kubernetes.io/projected/2298fd99-7188-4326-a55f-9feb9b25d03d-kube-api-access-2hsrm\") pod \"redhat-marketplace-kn9j4\" (UID: \"2298fd99-7188-4326-a55f-9feb9b25d03d\") " pod="openshift-marketplace/redhat-marketplace-kn9j4" Nov 24 09:06:16 crc kubenswrapper[4563]: I1124 09:06:16.547630 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kn9j4" Nov 24 09:06:16 crc kubenswrapper[4563]: I1124 09:06:16.739184 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kn9j4"] Nov 24 09:06:16 crc kubenswrapper[4563]: W1124 09:06:16.747631 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2298fd99_7188_4326_a55f_9feb9b25d03d.slice/crio-49108e687fcd1363e94c5c88e9ab14f0d51a638cb4d234d0f0fdaec2e62a843e WatchSource:0}: Error finding container 49108e687fcd1363e94c5c88e9ab14f0d51a638cb4d234d0f0fdaec2e62a843e: Status 404 returned error can't find the container with id 49108e687fcd1363e94c5c88e9ab14f0d51a638cb4d234d0f0fdaec2e62a843e Nov 24 09:06:16 crc kubenswrapper[4563]: I1124 09:06:16.917994 4563 generic.go:334] "Generic (PLEG): container finished" podID="2298fd99-7188-4326-a55f-9feb9b25d03d" containerID="344313d4c7a3a6330cc0866ff3c8bfcfaab4a7c0b92194708b363717ec4c65bd" exitCode=0 Nov 24 09:06:16 crc kubenswrapper[4563]: I1124 09:06:16.918207 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kn9j4" event={"ID":"2298fd99-7188-4326-a55f-9feb9b25d03d","Type":"ContainerDied","Data":"344313d4c7a3a6330cc0866ff3c8bfcfaab4a7c0b92194708b363717ec4c65bd"} Nov 24 09:06:16 crc kubenswrapper[4563]: I1124 09:06:16.918363 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kn9j4" event={"ID":"2298fd99-7188-4326-a55f-9feb9b25d03d","Type":"ContainerStarted","Data":"49108e687fcd1363e94c5c88e9ab14f0d51a638cb4d234d0f0fdaec2e62a843e"} Nov 24 09:06:16 crc kubenswrapper[4563]: I1124 09:06:16.921482 4563 generic.go:334] "Generic (PLEG): container finished" podID="41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e" containerID="78c7685b989084a92e118cb38e1e0c74fc250f824d0a0f998a7b59173d947a82" exitCode=0 Nov 24 09:06:16 crc kubenswrapper[4563]: I1124 
09:06:16.922006 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wp6lh" event={"ID":"41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e","Type":"ContainerDied","Data":"78c7685b989084a92e118cb38e1e0c74fc250f824d0a0f998a7b59173d947a82"} Nov 24 09:06:16 crc kubenswrapper[4563]: I1124 09:06:16.922042 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wp6lh" event={"ID":"41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e","Type":"ContainerStarted","Data":"a5ebc21ba3361324260a7420c40362257ef2ba5fe7f8935bca872321353877bd"} Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.025845 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kltmd"] Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.026835 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kltmd" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.030441 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.038138 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kltmd"] Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.135265 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.152816 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10d7e769-e95c-468d-b220-1fae07708825-utilities\") pod \"redhat-operators-kltmd\" (UID: \"10d7e769-e95c-468d-b220-1fae07708825\") " pod="openshift-marketplace/redhat-operators-kltmd" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.152851 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26hcg\" (UniqueName: \"kubernetes.io/projected/10d7e769-e95c-468d-b220-1fae07708825-kube-api-access-26hcg\") pod \"redhat-operators-kltmd\" (UID: \"10d7e769-e95c-468d-b220-1fae07708825\") " pod="openshift-marketplace/redhat-operators-kltmd" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.152896 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10d7e769-e95c-468d-b220-1fae07708825-catalog-content\") pod \"redhat-operators-kltmd\" (UID: \"10d7e769-e95c-468d-b220-1fae07708825\") " pod="openshift-marketplace/redhat-operators-kltmd" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.211940 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 24 09:06:17 crc kubenswrapper[4563]: E1124 09:06:17.212160 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61e55d3d-eab4-4f40-967f-87942bfc05bb" containerName="pruner" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.212178 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="61e55d3d-eab4-4f40-967f-87942bfc05bb" containerName="pruner" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.212277 4563 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="61e55d3d-eab4-4f40-967f-87942bfc05bb" containerName="pruner" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.212611 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.216038 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.216052 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.219656 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.242020 4563 patch_prober.go:28] interesting pod/router-default-5444994796-r5h7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 24 09:06:17 crc kubenswrapper[4563]: [-]has-synced failed: reason withheld Nov 24 09:06:17 crc kubenswrapper[4563]: [+]process-running ok Nov 24 09:06:17 crc kubenswrapper[4563]: healthz check failed Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.242072 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5h7f" podUID="fed196e5-1e64-4d16-b63f-297eac90a06d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.254109 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/61e55d3d-eab4-4f40-967f-87942bfc05bb-kubelet-dir\") pod \"61e55d3d-eab4-4f40-967f-87942bfc05bb\" (UID: \"61e55d3d-eab4-4f40-967f-87942bfc05bb\") " Nov 24 09:06:17 crc 
kubenswrapper[4563]: I1124 09:06:17.254163 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/61e55d3d-eab4-4f40-967f-87942bfc05bb-kube-api-access\") pod \"61e55d3d-eab4-4f40-967f-87942bfc05bb\" (UID: \"61e55d3d-eab4-4f40-967f-87942bfc05bb\") " Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.254160 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61e55d3d-eab4-4f40-967f-87942bfc05bb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "61e55d3d-eab4-4f40-967f-87942bfc05bb" (UID: "61e55d3d-eab4-4f40-967f-87942bfc05bb"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.254355 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10d7e769-e95c-468d-b220-1fae07708825-utilities\") pod \"redhat-operators-kltmd\" (UID: \"10d7e769-e95c-468d-b220-1fae07708825\") " pod="openshift-marketplace/redhat-operators-kltmd" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.254374 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26hcg\" (UniqueName: \"kubernetes.io/projected/10d7e769-e95c-468d-b220-1fae07708825-kube-api-access-26hcg\") pod \"redhat-operators-kltmd\" (UID: \"10d7e769-e95c-468d-b220-1fae07708825\") " pod="openshift-marketplace/redhat-operators-kltmd" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.254416 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10d7e769-e95c-468d-b220-1fae07708825-catalog-content\") pod \"redhat-operators-kltmd\" (UID: \"10d7e769-e95c-468d-b220-1fae07708825\") " pod="openshift-marketplace/redhat-operators-kltmd" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 
09:06:17.254476 4563 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/61e55d3d-eab4-4f40-967f-87942bfc05bb-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.254841 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10d7e769-e95c-468d-b220-1fae07708825-catalog-content\") pod \"redhat-operators-kltmd\" (UID: \"10d7e769-e95c-468d-b220-1fae07708825\") " pod="openshift-marketplace/redhat-operators-kltmd" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.255557 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10d7e769-e95c-468d-b220-1fae07708825-utilities\") pod \"redhat-operators-kltmd\" (UID: \"10d7e769-e95c-468d-b220-1fae07708825\") " pod="openshift-marketplace/redhat-operators-kltmd" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.258848 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61e55d3d-eab4-4f40-967f-87942bfc05bb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "61e55d3d-eab4-4f40-967f-87942bfc05bb" (UID: "61e55d3d-eab4-4f40-967f-87942bfc05bb"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.267821 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26hcg\" (UniqueName: \"kubernetes.io/projected/10d7e769-e95c-468d-b220-1fae07708825-kube-api-access-26hcg\") pod \"redhat-operators-kltmd\" (UID: \"10d7e769-e95c-468d-b220-1fae07708825\") " pod="openshift-marketplace/redhat-operators-kltmd" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.346622 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kltmd" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.355169 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/020dab9d-2ca7-4fc4-b6c1-4094bd50677a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"020dab9d-2ca7-4fc4-b6c1-4094bd50677a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.355208 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/020dab9d-2ca7-4fc4-b6c1-4094bd50677a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"020dab9d-2ca7-4fc4-b6c1-4094bd50677a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.355618 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/61e55d3d-eab4-4f40-967f-87942bfc05bb-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.416529 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zjn6s"] Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.417618 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zjn6s" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.422181 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zjn6s"] Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.457443 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/020dab9d-2ca7-4fc4-b6c1-4094bd50677a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"020dab9d-2ca7-4fc4-b6c1-4094bd50677a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.457771 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/020dab9d-2ca7-4fc4-b6c1-4094bd50677a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"020dab9d-2ca7-4fc4-b6c1-4094bd50677a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.457974 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/020dab9d-2ca7-4fc4-b6c1-4094bd50677a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"020dab9d-2ca7-4fc4-b6c1-4094bd50677a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.474767 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/020dab9d-2ca7-4fc4-b6c1-4094bd50677a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"020dab9d-2ca7-4fc4-b6c1-4094bd50677a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.529745 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.558495 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4a8c9f6-6239-4258-8b3a-1d872a211813-catalog-content\") pod \"redhat-operators-zjn6s\" (UID: \"a4a8c9f6-6239-4258-8b3a-1d872a211813\") " pod="openshift-marketplace/redhat-operators-zjn6s" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.558565 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn9st\" (UniqueName: \"kubernetes.io/projected/a4a8c9f6-6239-4258-8b3a-1d872a211813-kube-api-access-gn9st\") pod \"redhat-operators-zjn6s\" (UID: \"a4a8c9f6-6239-4258-8b3a-1d872a211813\") " pod="openshift-marketplace/redhat-operators-zjn6s" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.558650 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4a8c9f6-6239-4258-8b3a-1d872a211813-utilities\") pod \"redhat-operators-zjn6s\" (UID: \"a4a8c9f6-6239-4258-8b3a-1d872a211813\") " pod="openshift-marketplace/redhat-operators-zjn6s" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.658650 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zmp5c" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.659834 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn9st\" (UniqueName: \"kubernetes.io/projected/a4a8c9f6-6239-4258-8b3a-1d872a211813-kube-api-access-gn9st\") pod \"redhat-operators-zjn6s\" (UID: \"a4a8c9f6-6239-4258-8b3a-1d872a211813\") " pod="openshift-marketplace/redhat-operators-zjn6s" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.659896 4563 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4a8c9f6-6239-4258-8b3a-1d872a211813-utilities\") pod \"redhat-operators-zjn6s\" (UID: \"a4a8c9f6-6239-4258-8b3a-1d872a211813\") " pod="openshift-marketplace/redhat-operators-zjn6s" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.659941 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4a8c9f6-6239-4258-8b3a-1d872a211813-catalog-content\") pod \"redhat-operators-zjn6s\" (UID: \"a4a8c9f6-6239-4258-8b3a-1d872a211813\") " pod="openshift-marketplace/redhat-operators-zjn6s" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.660327 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4a8c9f6-6239-4258-8b3a-1d872a211813-catalog-content\") pod \"redhat-operators-zjn6s\" (UID: \"a4a8c9f6-6239-4258-8b3a-1d872a211813\") " pod="openshift-marketplace/redhat-operators-zjn6s" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.660403 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4a8c9f6-6239-4258-8b3a-1d872a211813-utilities\") pod \"redhat-operators-zjn6s\" (UID: \"a4a8c9f6-6239-4258-8b3a-1d872a211813\") " pod="openshift-marketplace/redhat-operators-zjn6s" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.673301 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn9st\" (UniqueName: \"kubernetes.io/projected/a4a8c9f6-6239-4258-8b3a-1d872a211813-kube-api-access-gn9st\") pod \"redhat-operators-zjn6s\" (UID: \"a4a8c9f6-6239-4258-8b3a-1d872a211813\") " pod="openshift-marketplace/redhat-operators-zjn6s" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.752016 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zjn6s" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.785872 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.789850 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-l6b8r" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.821880 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-tzmwx" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.826210 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-7hx7w" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.826237 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-7hx7w" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.831385 4563 patch_prober.go:28] interesting pod/console-f9d7485db-7hx7w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.831417 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-7hx7w" podUID="66a93c10-4a3e-4c5a-a0b0-ace9213e4a28" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.942148 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.947449 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"61e55d3d-eab4-4f40-967f-87942bfc05bb","Type":"ContainerDied","Data":"14009cc5ebceaf47c49eb5534447ae51203cbc5039c7c63e6c24c9db2dd580e0"} Nov 24 09:06:17 crc kubenswrapper[4563]: I1124 09:06:17.947523 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14009cc5ebceaf47c49eb5534447ae51203cbc5039c7c63e6c24c9db2dd580e0" Nov 24 09:06:18 crc kubenswrapper[4563]: I1124 09:06:18.238180 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-r5h7f" Nov 24 09:06:18 crc kubenswrapper[4563]: I1124 09:06:18.240611 4563 patch_prober.go:28] interesting pod/router-default-5444994796-r5h7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 24 09:06:18 crc kubenswrapper[4563]: [-]has-synced failed: reason withheld Nov 24 09:06:18 crc kubenswrapper[4563]: [+]process-running ok Nov 24 09:06:18 crc kubenswrapper[4563]: healthz check failed Nov 24 09:06:18 crc kubenswrapper[4563]: I1124 09:06:18.240687 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5h7f" podUID="fed196e5-1e64-4d16-b63f-297eac90a06d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 24 09:06:19 crc kubenswrapper[4563]: I1124 09:06:19.241097 4563 patch_prober.go:28] interesting pod/router-default-5444994796-r5h7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 24 09:06:19 crc 
kubenswrapper[4563]: [-]has-synced failed: reason withheld Nov 24 09:06:19 crc kubenswrapper[4563]: [+]process-running ok Nov 24 09:06:19 crc kubenswrapper[4563]: healthz check failed Nov 24 09:06:19 crc kubenswrapper[4563]: I1124 09:06:19.241149 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5h7f" podUID="fed196e5-1e64-4d16-b63f-297eac90a06d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 24 09:06:20 crc kubenswrapper[4563]: I1124 09:06:20.135773 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kltmd"] Nov 24 09:06:20 crc kubenswrapper[4563]: W1124 09:06:20.146005 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10d7e769_e95c_468d_b220_1fae07708825.slice/crio-be3d98eca2437c649edab4c862648ab229da7d19e7d5681db7f7f8f27c7fd545 WatchSource:0}: Error finding container be3d98eca2437c649edab4c862648ab229da7d19e7d5681db7f7f8f27c7fd545: Status 404 returned error can't find the container with id be3d98eca2437c649edab4c862648ab229da7d19e7d5681db7f7f8f27c7fd545 Nov 24 09:06:20 crc kubenswrapper[4563]: I1124 09:06:20.241437 4563 patch_prober.go:28] interesting pod/router-default-5444994796-r5h7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 24 09:06:20 crc kubenswrapper[4563]: [-]has-synced failed: reason withheld Nov 24 09:06:20 crc kubenswrapper[4563]: [+]process-running ok Nov 24 09:06:20 crc kubenswrapper[4563]: healthz check failed Nov 24 09:06:20 crc kubenswrapper[4563]: I1124 09:06:20.241571 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5h7f" podUID="fed196e5-1e64-4d16-b63f-297eac90a06d" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Nov 24 09:06:20 crc kubenswrapper[4563]: I1124 09:06:20.411588 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zjn6s"] Nov 24 09:06:20 crc kubenswrapper[4563]: I1124 09:06:20.415732 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 24 09:06:20 crc kubenswrapper[4563]: I1124 09:06:20.962490 4563 generic.go:334] "Generic (PLEG): container finished" podID="a4a8c9f6-6239-4258-8b3a-1d872a211813" containerID="44afe1ed8ab549ef39ef1c146967bee926749a149c398cfb33050193fe2fe8c6" exitCode=0 Nov 24 09:06:20 crc kubenswrapper[4563]: I1124 09:06:20.963317 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjn6s" event={"ID":"a4a8c9f6-6239-4258-8b3a-1d872a211813","Type":"ContainerDied","Data":"44afe1ed8ab549ef39ef1c146967bee926749a149c398cfb33050193fe2fe8c6"} Nov 24 09:06:20 crc kubenswrapper[4563]: I1124 09:06:20.963362 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjn6s" event={"ID":"a4a8c9f6-6239-4258-8b3a-1d872a211813","Type":"ContainerStarted","Data":"2711cb70da62ca0e0e285a86fbbc4e9f8cc047a9e2f85807b0600376be5a7ce7"} Nov 24 09:06:20 crc kubenswrapper[4563]: I1124 09:06:20.967990 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"020dab9d-2ca7-4fc4-b6c1-4094bd50677a","Type":"ContainerStarted","Data":"6bc6040ebb57b5d37eb91123e1954ce6fea2e43593b448b27b6524bebd1a4645"} Nov 24 09:06:20 crc kubenswrapper[4563]: I1124 09:06:20.968022 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"020dab9d-2ca7-4fc4-b6c1-4094bd50677a","Type":"ContainerStarted","Data":"e5da58d562eb533d4767957d84585c4d2180501fc46e2b13b2f964a0e87ffc0e"} Nov 24 09:06:20 crc kubenswrapper[4563]: I1124 09:06:20.972904 4563 
generic.go:334] "Generic (PLEG): container finished" podID="10d7e769-e95c-468d-b220-1fae07708825" containerID="e5eb125a1f1333c2c959936f98c7fe61404bcd6a6d702e4bb0e7b8207b8b0ee6" exitCode=0 Nov 24 09:06:20 crc kubenswrapper[4563]: I1124 09:06:20.972945 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kltmd" event={"ID":"10d7e769-e95c-468d-b220-1fae07708825","Type":"ContainerDied","Data":"e5eb125a1f1333c2c959936f98c7fe61404bcd6a6d702e4bb0e7b8207b8b0ee6"} Nov 24 09:06:20 crc kubenswrapper[4563]: I1124 09:06:20.972969 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kltmd" event={"ID":"10d7e769-e95c-468d-b220-1fae07708825","Type":"ContainerStarted","Data":"be3d98eca2437c649edab4c862648ab229da7d19e7d5681db7f7f8f27c7fd545"} Nov 24 09:06:21 crc kubenswrapper[4563]: I1124 09:06:21.019035 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=4.019015728 podStartE2EDuration="4.019015728s" podCreationTimestamp="2025-11-24 09:06:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:21.017030916 +0000 UTC m=+158.276008362" watchObservedRunningTime="2025-11-24 09:06:21.019015728 +0000 UTC m=+158.277993175" Nov 24 09:06:21 crc kubenswrapper[4563]: I1124 09:06:21.243868 4563 patch_prober.go:28] interesting pod/router-default-5444994796-r5h7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 24 09:06:21 crc kubenswrapper[4563]: [-]has-synced failed: reason withheld Nov 24 09:06:21 crc kubenswrapper[4563]: [+]process-running ok Nov 24 09:06:21 crc kubenswrapper[4563]: healthz check failed Nov 24 09:06:21 crc kubenswrapper[4563]: I1124 09:06:21.244174 4563 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5h7f" podUID="fed196e5-1e64-4d16-b63f-297eac90a06d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 24 09:06:21 crc kubenswrapper[4563]: I1124 09:06:21.979008 4563 generic.go:334] "Generic (PLEG): container finished" podID="020dab9d-2ca7-4fc4-b6c1-4094bd50677a" containerID="6bc6040ebb57b5d37eb91123e1954ce6fea2e43593b448b27b6524bebd1a4645" exitCode=0 Nov 24 09:06:21 crc kubenswrapper[4563]: I1124 09:06:21.979046 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"020dab9d-2ca7-4fc4-b6c1-4094bd50677a","Type":"ContainerDied","Data":"6bc6040ebb57b5d37eb91123e1954ce6fea2e43593b448b27b6524bebd1a4645"} Nov 24 09:06:22 crc kubenswrapper[4563]: I1124 09:06:22.240348 4563 patch_prober.go:28] interesting pod/router-default-5444994796-r5h7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 24 09:06:22 crc kubenswrapper[4563]: [-]has-synced failed: reason withheld Nov 24 09:06:22 crc kubenswrapper[4563]: [+]process-running ok Nov 24 09:06:22 crc kubenswrapper[4563]: healthz check failed Nov 24 09:06:22 crc kubenswrapper[4563]: I1124 09:06:22.240401 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5h7f" podUID="fed196e5-1e64-4d16-b63f-297eac90a06d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 24 09:06:23 crc kubenswrapper[4563]: I1124 09:06:23.194292 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 24 09:06:23 crc kubenswrapper[4563]: I1124 09:06:23.241137 4563 patch_prober.go:28] interesting pod/router-default-5444994796-r5h7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 24 09:06:23 crc kubenswrapper[4563]: [-]has-synced failed: reason withheld Nov 24 09:06:23 crc kubenswrapper[4563]: [+]process-running ok Nov 24 09:06:23 crc kubenswrapper[4563]: healthz check failed Nov 24 09:06:23 crc kubenswrapper[4563]: I1124 09:06:23.241193 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5h7f" podUID="fed196e5-1e64-4d16-b63f-297eac90a06d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 24 09:06:23 crc kubenswrapper[4563]: I1124 09:06:23.365465 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/020dab9d-2ca7-4fc4-b6c1-4094bd50677a-kubelet-dir\") pod \"020dab9d-2ca7-4fc4-b6c1-4094bd50677a\" (UID: \"020dab9d-2ca7-4fc4-b6c1-4094bd50677a\") " Nov 24 09:06:23 crc kubenswrapper[4563]: I1124 09:06:23.365570 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/020dab9d-2ca7-4fc4-b6c1-4094bd50677a-kube-api-access\") pod \"020dab9d-2ca7-4fc4-b6c1-4094bd50677a\" (UID: \"020dab9d-2ca7-4fc4-b6c1-4094bd50677a\") " Nov 24 09:06:23 crc kubenswrapper[4563]: I1124 09:06:23.365615 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/020dab9d-2ca7-4fc4-b6c1-4094bd50677a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "020dab9d-2ca7-4fc4-b6c1-4094bd50677a" (UID: "020dab9d-2ca7-4fc4-b6c1-4094bd50677a"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:06:23 crc kubenswrapper[4563]: I1124 09:06:23.365800 4563 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/020dab9d-2ca7-4fc4-b6c1-4094bd50677a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 24 09:06:23 crc kubenswrapper[4563]: I1124 09:06:23.373398 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/020dab9d-2ca7-4fc4-b6c1-4094bd50677a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "020dab9d-2ca7-4fc4-b6c1-4094bd50677a" (UID: "020dab9d-2ca7-4fc4-b6c1-4094bd50677a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:06:23 crc kubenswrapper[4563]: I1124 09:06:23.467046 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/020dab9d-2ca7-4fc4-b6c1-4094bd50677a-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 24 09:06:23 crc kubenswrapper[4563]: I1124 09:06:23.607625 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-glp9q" Nov 24 09:06:23 crc kubenswrapper[4563]: I1124 09:06:23.987997 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"020dab9d-2ca7-4fc4-b6c1-4094bd50677a","Type":"ContainerDied","Data":"e5da58d562eb533d4767957d84585c4d2180501fc46e2b13b2f964a0e87ffc0e"} Nov 24 09:06:23 crc kubenswrapper[4563]: I1124 09:06:23.988044 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 24 09:06:23 crc kubenswrapper[4563]: I1124 09:06:23.988049 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5da58d562eb533d4767957d84585c4d2180501fc46e2b13b2f964a0e87ffc0e"
Nov 24 09:06:24 crc kubenswrapper[4563]: I1124 09:06:24.241113 4563 patch_prober.go:28] interesting pod/router-default-5444994796-r5h7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 24 09:06:24 crc kubenswrapper[4563]: [-]has-synced failed: reason withheld
Nov 24 09:06:24 crc kubenswrapper[4563]: [+]process-running ok
Nov 24 09:06:24 crc kubenswrapper[4563]: healthz check failed
Nov 24 09:06:24 crc kubenswrapper[4563]: I1124 09:06:24.241175 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5h7f" podUID="fed196e5-1e64-4d16-b63f-297eac90a06d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 24 09:06:25 crc kubenswrapper[4563]: I1124 09:06:25.240767 4563 patch_prober.go:28] interesting pod/router-default-5444994796-r5h7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 24 09:06:25 crc kubenswrapper[4563]: [-]has-synced failed: reason withheld
Nov 24 09:06:25 crc kubenswrapper[4563]: [+]process-running ok
Nov 24 09:06:25 crc kubenswrapper[4563]: healthz check failed
Nov 24 09:06:25 crc kubenswrapper[4563]: I1124 09:06:25.241014 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5h7f" podUID="fed196e5-1e64-4d16-b63f-297eac90a06d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 24 09:06:26 crc kubenswrapper[4563]: I1124 09:06:26.241414 4563 patch_prober.go:28] interesting pod/router-default-5444994796-r5h7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 24 09:06:26 crc kubenswrapper[4563]: [-]has-synced failed: reason withheld
Nov 24 09:06:26 crc kubenswrapper[4563]: [+]process-running ok
Nov 24 09:06:26 crc kubenswrapper[4563]: healthz check failed
Nov 24 09:06:26 crc kubenswrapper[4563]: I1124 09:06:26.241512 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5h7f" podUID="fed196e5-1e64-4d16-b63f-297eac90a06d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 24 09:06:27 crc kubenswrapper[4563]: I1124 09:06:27.224131 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-metrics-certs\") pod \"network-metrics-daemon-bsfsd\" (UID: \"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\") " pod="openshift-multus/network-metrics-daemon-bsfsd"
Nov 24 09:06:27 crc kubenswrapper[4563]: I1124 09:06:27.231888 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0-metrics-certs\") pod \"network-metrics-daemon-bsfsd\" (UID: \"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0\") " pod="openshift-multus/network-metrics-daemon-bsfsd"
Nov 24 09:06:27 crc kubenswrapper[4563]: I1124 09:06:27.241623 4563 patch_prober.go:28] interesting pod/router-default-5444994796-r5h7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 24 09:06:27 crc kubenswrapper[4563]: [-]has-synced failed: reason withheld
Nov 24 09:06:27 crc kubenswrapper[4563]: [+]process-running ok
Nov 24 09:06:27 crc kubenswrapper[4563]: healthz check failed
Nov 24 09:06:27 crc kubenswrapper[4563]: I1124 09:06:27.241678 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5h7f" podUID="fed196e5-1e64-4d16-b63f-297eac90a06d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 24 09:06:27 crc kubenswrapper[4563]: I1124 09:06:27.370357 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bsfsd"
Nov 24 09:06:27 crc kubenswrapper[4563]: I1124 09:06:27.825998 4563 patch_prober.go:28] interesting pod/console-f9d7485db-7hx7w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Nov 24 09:06:27 crc kubenswrapper[4563]: I1124 09:06:27.826042 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-7hx7w" podUID="66a93c10-4a3e-4c5a-a0b0-ace9213e4a28" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused"
Nov 24 09:06:28 crc kubenswrapper[4563]: I1124 09:06:28.241231 4563 patch_prober.go:28] interesting pod/router-default-5444994796-r5h7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 24 09:06:28 crc kubenswrapper[4563]: [-]has-synced failed: reason withheld
Nov 24 09:06:28 crc kubenswrapper[4563]: [+]process-running ok
Nov 24 09:06:28 crc kubenswrapper[4563]: healthz check failed
Nov 24 09:06:28 crc kubenswrapper[4563]: I1124 09:06:28.241352 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r5h7f" podUID="fed196e5-1e64-4d16-b63f-297eac90a06d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 24 09:06:29 crc kubenswrapper[4563]: I1124 09:06:29.241752 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-r5h7f"
Nov 24 09:06:29 crc kubenswrapper[4563]: I1124 09:06:29.244185 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-r5h7f"
Nov 24 09:06:31 crc kubenswrapper[4563]: I1124 09:06:31.034996 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xt68x" event={"ID":"d318ef99-9cb4-4f69-81dc-183d64e7532c","Type":"ContainerStarted","Data":"85244fa013dd2f6c286b26aa1190923881f81e2b6f1d9c0acec48d8504b5e009"}
Nov 24 09:06:31 crc kubenswrapper[4563]: I1124 09:06:31.044438 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfx45" event={"ID":"0b5917ce-6f12-449f-9637-a94eff40aea4","Type":"ContainerStarted","Data":"07e4fb85fd14e47459abad43e90d1a6876f7ae024bddd6f132a4de95c7dbf7c7"}
Nov 24 09:06:31 crc kubenswrapper[4563]: I1124 09:06:31.061667 4563 generic.go:334] "Generic (PLEG): container finished" podID="2298fd99-7188-4326-a55f-9feb9b25d03d" containerID="de58599c7e8804a25145dffe1ecce957d7262e5e946f2b5059f54f6049bb008c" exitCode=0
Nov 24 09:06:31 crc kubenswrapper[4563]: I1124 09:06:31.062274 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jhn4v" event={"ID":"c852825f-7fac-45d3-b801-ae6eb253989a","Type":"ContainerStarted","Data":"5b43dc66373a40d2d90857babe4c6d78d306cd5267139346eda971a3652a0308"}
Nov 24 09:06:31 crc kubenswrapper[4563]: I1124 09:06:31.062330 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kn9j4" event={"ID":"2298fd99-7188-4326-a55f-9feb9b25d03d","Type":"ContainerDied","Data":"de58599c7e8804a25145dffe1ecce957d7262e5e946f2b5059f54f6049bb008c"}
Nov 24 09:06:31 crc kubenswrapper[4563]: I1124 09:06:31.065379 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wp6lh" event={"ID":"41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e","Type":"ContainerStarted","Data":"381ff15e3c62d403517b73d2204fa792d5aeb0e7ba0ce5d44e2a59e4e666b5bb"}
Nov 24 09:06:31 crc kubenswrapper[4563]: I1124 09:06:31.067592 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fkz2h" event={"ID":"b8c919bf-e04d-4c09-84fa-064f434383bb","Type":"ContainerStarted","Data":"7f57ddaf491675a3c3ae0d2b3ff772f54141c8e9f97e29882570b63a73a651d6"}
Nov 24 09:06:31 crc kubenswrapper[4563]: I1124 09:06:31.149535 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bsfsd"]
Nov 24 09:06:32 crc kubenswrapper[4563]: I1124 09:06:32.078882 4563 generic.go:334] "Generic (PLEG): container finished" podID="d318ef99-9cb4-4f69-81dc-183d64e7532c" containerID="85244fa013dd2f6c286b26aa1190923881f81e2b6f1d9c0acec48d8504b5e009" exitCode=0
Nov 24 09:06:32 crc kubenswrapper[4563]: I1124 09:06:32.078963 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xt68x" event={"ID":"d318ef99-9cb4-4f69-81dc-183d64e7532c","Type":"ContainerDied","Data":"85244fa013dd2f6c286b26aa1190923881f81e2b6f1d9c0acec48d8504b5e009"}
Nov 24 09:06:32 crc kubenswrapper[4563]: I1124 09:06:32.082685 4563 generic.go:334] "Generic (PLEG): container finished" podID="0b5917ce-6f12-449f-9637-a94eff40aea4" containerID="07e4fb85fd14e47459abad43e90d1a6876f7ae024bddd6f132a4de95c7dbf7c7" exitCode=0
Nov 24 09:06:32 crc kubenswrapper[4563]: I1124 09:06:32.083450 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfx45" event={"ID":"0b5917ce-6f12-449f-9637-a94eff40aea4","Type":"ContainerDied","Data":"07e4fb85fd14e47459abad43e90d1a6876f7ae024bddd6f132a4de95c7dbf7c7"}
Nov 24 09:06:32 crc kubenswrapper[4563]: I1124 09:06:32.085567 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bsfsd" event={"ID":"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0","Type":"ContainerStarted","Data":"e64cb10000df898553f41521c5e933267f6e4988df8f4dcad18487abe7d7c5ad"}
Nov 24 09:06:32 crc kubenswrapper[4563]: I1124 09:06:32.085595 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bsfsd" event={"ID":"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0","Type":"ContainerStarted","Data":"6e5d5223b1089ce4891c873f21fed2289f9cb82094547c8151cb911a9cfde1a1"}
Nov 24 09:06:32 crc kubenswrapper[4563]: I1124 09:06:32.085618 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bsfsd" event={"ID":"4f3d4be8-448b-47a5-b6b4-e9b9552b3fa0","Type":"ContainerStarted","Data":"83b13c8cdea621bd484759c6ea91b0c615bcd0c645c763f8379b0808ce1defc8"}
Nov 24 09:06:32 crc kubenswrapper[4563]: I1124 09:06:32.087215 4563 generic.go:334] "Generic (PLEG): container finished" podID="c852825f-7fac-45d3-b801-ae6eb253989a" containerID="5b43dc66373a40d2d90857babe4c6d78d306cd5267139346eda971a3652a0308" exitCode=0
Nov 24 09:06:32 crc kubenswrapper[4563]: I1124 09:06:32.087243 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jhn4v" event={"ID":"c852825f-7fac-45d3-b801-ae6eb253989a","Type":"ContainerDied","Data":"5b43dc66373a40d2d90857babe4c6d78d306cd5267139346eda971a3652a0308"}
Nov 24 09:06:32 crc kubenswrapper[4563]: I1124 09:06:32.105964 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kn9j4" event={"ID":"2298fd99-7188-4326-a55f-9feb9b25d03d","Type":"ContainerStarted","Data":"27a53ae464784842a98a91fdcb2416f605cda39c64eb6f85a202c198552c164d"}
Nov 24 09:06:32 crc kubenswrapper[4563]: I1124 09:06:32.108490 4563 generic.go:334] "Generic (PLEG): container finished" podID="41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e" containerID="381ff15e3c62d403517b73d2204fa792d5aeb0e7ba0ce5d44e2a59e4e666b5bb" exitCode=0
Nov 24 09:06:32 crc kubenswrapper[4563]: I1124 09:06:32.108569 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wp6lh" event={"ID":"41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e","Type":"ContainerDied","Data":"381ff15e3c62d403517b73d2204fa792d5aeb0e7ba0ce5d44e2a59e4e666b5bb"}
Nov 24 09:06:32 crc kubenswrapper[4563]: I1124 09:06:32.117243 4563 generic.go:334] "Generic (PLEG): container finished" podID="b8c919bf-e04d-4c09-84fa-064f434383bb" containerID="7f57ddaf491675a3c3ae0d2b3ff772f54141c8e9f97e29882570b63a73a651d6" exitCode=0
Nov 24 09:06:32 crc kubenswrapper[4563]: I1124 09:06:32.117294 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fkz2h" event={"ID":"b8c919bf-e04d-4c09-84fa-064f434383bb","Type":"ContainerDied","Data":"7f57ddaf491675a3c3ae0d2b3ff772f54141c8e9f97e29882570b63a73a651d6"}
Nov 24 09:06:32 crc kubenswrapper[4563]: I1124 09:06:32.118120 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-bsfsd" podStartSLOduration=147.118102898 podStartE2EDuration="2m27.118102898s" podCreationTimestamp="2025-11-24 09:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:06:32.114587357 +0000 UTC m=+169.373564794" watchObservedRunningTime="2025-11-24 09:06:32.118102898 +0000 UTC m=+169.377080344"
Nov 24 09:06:32 crc kubenswrapper[4563]: I1124 09:06:32.186458 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kn9j4" podStartSLOduration=1.556461977 podStartE2EDuration="16.186437104s" podCreationTimestamp="2025-11-24 09:06:16 +0000 UTC" firstStartedPulling="2025-11-24 09:06:16.919880207 +0000 UTC m=+154.178857655" lastFinishedPulling="2025-11-24 09:06:31.549855335 +0000 UTC m=+168.808832782" observedRunningTime="2025-11-24 09:06:32.183936297 +0000 UTC m=+169.442913744" watchObservedRunningTime="2025-11-24 09:06:32.186437104 +0000 UTC m=+169.445414552"
Nov 24 09:06:33 crc kubenswrapper[4563]: I1124 09:06:33.558593 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8"
Nov 24 09:06:35 crc kubenswrapper[4563]: I1124 09:06:35.141056 4563 generic.go:334] "Generic (PLEG): container finished" podID="a4a8c9f6-6239-4258-8b3a-1d872a211813" containerID="fd0bf477a5a3ac7f8380f2f2987d370916d20ab3ed1ccaa2e03c45a43d0b33a5" exitCode=0
Nov 24 09:06:35 crc kubenswrapper[4563]: I1124 09:06:35.141127 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjn6s" event={"ID":"a4a8c9f6-6239-4258-8b3a-1d872a211813","Type":"ContainerDied","Data":"fd0bf477a5a3ac7f8380f2f2987d370916d20ab3ed1ccaa2e03c45a43d0b33a5"}
Nov 24 09:06:35 crc kubenswrapper[4563]: I1124 09:06:35.147347 4563 generic.go:334] "Generic (PLEG): container finished" podID="10d7e769-e95c-468d-b220-1fae07708825" containerID="aa308ea2e9a775af023fa218ee0aa86d391c75d709d10dba15393a1eff10c1f4" exitCode=0
Nov 24 09:06:35 crc kubenswrapper[4563]: I1124 09:06:35.147460 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kltmd" event={"ID":"10d7e769-e95c-468d-b220-1fae07708825","Type":"ContainerDied","Data":"aa308ea2e9a775af023fa218ee0aa86d391c75d709d10dba15393a1eff10c1f4"}
Nov 24 09:06:36 crc kubenswrapper[4563]: I1124 09:06:36.154601 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fkz2h" event={"ID":"b8c919bf-e04d-4c09-84fa-064f434383bb","Type":"ContainerStarted","Data":"1a7fcfc73514912064910e1c8c5ca88be3b96eaceca810b00db895b968d14343"}
Nov 24 09:06:36 crc kubenswrapper[4563]: I1124 09:06:36.157381 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kltmd" event={"ID":"10d7e769-e95c-468d-b220-1fae07708825","Type":"ContainerStarted","Data":"4c9e7f41601e4dc7bea0e711a68fb94187c68eeec6fb26ffb9b367eb60ea769e"}
Nov 24 09:06:36 crc kubenswrapper[4563]: I1124 09:06:36.159485 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjn6s" event={"ID":"a4a8c9f6-6239-4258-8b3a-1d872a211813","Type":"ContainerStarted","Data":"ffe0c46478a97bea63b9970fbe76f6cfd3e8c9bfdddec827cb3ee0814ce49389"}
Nov 24 09:06:36 crc kubenswrapper[4563]: I1124 09:06:36.161722 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xt68x" event={"ID":"d318ef99-9cb4-4f69-81dc-183d64e7532c","Type":"ContainerStarted","Data":"bf9a1ca62d67bfe8c376dec3af8078565f39c3d4eae3dd2e48b02e2570c66ae6"}
Nov 24 09:06:36 crc kubenswrapper[4563]: I1124 09:06:36.163734 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfx45" event={"ID":"0b5917ce-6f12-449f-9637-a94eff40aea4","Type":"ContainerStarted","Data":"97b34bd51fee84078e7971d495526d63950bbe629b1e6d905e463f264041dd87"}
Nov 24 09:06:36 crc kubenswrapper[4563]: I1124 09:06:36.165585 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jhn4v" event={"ID":"c852825f-7fac-45d3-b801-ae6eb253989a","Type":"ContainerStarted","Data":"bd1b821eb6953f3bee8ea0a55411fe55393dd64decd840c91a41df3c982c6924"}
Nov 24 09:06:36 crc kubenswrapper[4563]: I1124 09:06:36.167455 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wp6lh" event={"ID":"41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e","Type":"ContainerStarted","Data":"2037f7894bd1ebe2835c26ea5f66c5ca7178fef508a345ba186f128d502f0fe5"}
Nov 24 09:06:36 crc kubenswrapper[4563]: I1124 09:06:36.173150 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fkz2h" podStartSLOduration=2.504239831 podStartE2EDuration="23.173139329s" podCreationTimestamp="2025-11-24 09:06:13 +0000 UTC" firstStartedPulling="2025-11-24 09:06:14.892702715 +0000 UTC m=+152.151680162" lastFinishedPulling="2025-11-24 09:06:35.561602212 +0000 UTC m=+172.820579660" observedRunningTime="2025-11-24 09:06:36.170032549 +0000 UTC m=+173.429009996" watchObservedRunningTime="2025-11-24 09:06:36.173139329 +0000 UTC m=+173.432116775"
Nov 24 09:06:36 crc kubenswrapper[4563]: I1124 09:06:36.185705 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zjn6s" podStartSLOduration=4.406499333 podStartE2EDuration="19.18569652s" podCreationTimestamp="2025-11-24 09:06:17 +0000 UTC" firstStartedPulling="2025-11-24 09:06:20.964731229 +0000 UTC m=+158.223708676" lastFinishedPulling="2025-11-24 09:06:35.743928415 +0000 UTC m=+173.002905863" observedRunningTime="2025-11-24 09:06:36.184956795 +0000 UTC m=+173.443934242" watchObservedRunningTime="2025-11-24 09:06:36.18569652 +0000 UTC m=+173.444673967"
Nov 24 09:06:36 crc kubenswrapper[4563]: I1124 09:06:36.203739 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tfx45" podStartSLOduration=2.782087824 podStartE2EDuration="22.203720994s" podCreationTimestamp="2025-11-24 09:06:14 +0000 UTC" firstStartedPulling="2025-11-24 09:06:15.91023025 +0000 UTC m=+153.169207697" lastFinishedPulling="2025-11-24 09:06:35.33186342 +0000 UTC m=+172.590840867" observedRunningTime="2025-11-24 09:06:36.200433434 +0000 UTC m=+173.459410881" watchObservedRunningTime="2025-11-24 09:06:36.203720994 +0000 UTC m=+173.462698442"
Nov 24 09:06:36 crc kubenswrapper[4563]: I1124 09:06:36.221334 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jhn4v" podStartSLOduration=2.007408878 podStartE2EDuration="22.221322461s" podCreationTimestamp="2025-11-24 09:06:14 +0000 UTC" firstStartedPulling="2025-11-24 09:06:14.886504374 +0000 UTC m=+152.145481822" lastFinishedPulling="2025-11-24 09:06:35.100417959 +0000 UTC m=+172.359395405" observedRunningTime="2025-11-24 09:06:36.218219178 +0000 UTC m=+173.477196625" watchObservedRunningTime="2025-11-24 09:06:36.221322461 +0000 UTC m=+173.480299908"
Nov 24 09:06:36 crc kubenswrapper[4563]: I1124 09:06:36.239687 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kltmd" podStartSLOduration=4.454694088 podStartE2EDuration="19.239665895s" podCreationTimestamp="2025-11-24 09:06:17 +0000 UTC" firstStartedPulling="2025-11-24 09:06:20.974787354 +0000 UTC m=+158.233764802" lastFinishedPulling="2025-11-24 09:06:35.759759162 +0000 UTC m=+173.018736609" observedRunningTime="2025-11-24 09:06:36.234386028 +0000 UTC m=+173.493363475" watchObservedRunningTime="2025-11-24 09:06:36.239665895 +0000 UTC m=+173.498643342"
Nov 24 09:06:36 crc kubenswrapper[4563]: I1124 09:06:36.262352 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wp6lh" podStartSLOduration=2.612891921 podStartE2EDuration="21.262335319s" podCreationTimestamp="2025-11-24 09:06:15 +0000 UTC" firstStartedPulling="2025-11-24 09:06:16.924104815 +0000 UTC m=+154.183082262" lastFinishedPulling="2025-11-24 09:06:35.573548212 +0000 UTC m=+172.832525660" observedRunningTime="2025-11-24 09:06:36.259652851 +0000 UTC m=+173.518630297" watchObservedRunningTime="2025-11-24 09:06:36.262335319 +0000 UTC m=+173.521312767"
Nov 24 09:06:36 crc kubenswrapper[4563]: I1124 09:06:36.548307 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kn9j4"
Nov 24 09:06:36 crc kubenswrapper[4563]: I1124 09:06:36.548453 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kn9j4"
Nov 24 09:06:36 crc kubenswrapper[4563]: I1124 09:06:36.661044 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kn9j4"
Nov 24 09:06:36 crc kubenswrapper[4563]: I1124 09:06:36.684544 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xt68x" podStartSLOduration=2.873287785 podStartE2EDuration="22.684527425s" podCreationTimestamp="2025-11-24 09:06:14 +0000 UTC" firstStartedPulling="2025-11-24 09:06:15.905147493 +0000 UTC m=+153.164124940" lastFinishedPulling="2025-11-24 09:06:35.716387133 +0000 UTC m=+172.975364580" observedRunningTime="2025-11-24 09:06:36.280159065 +0000 UTC m=+173.539136512" watchObservedRunningTime="2025-11-24 09:06:36.684527425 +0000 UTC m=+173.943504872"
Nov 24 09:06:37 crc kubenswrapper[4563]: I1124 09:06:37.208108 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kn9j4"
Nov 24 09:06:37 crc kubenswrapper[4563]: I1124 09:06:37.347750 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kltmd"
Nov 24 09:06:37 crc kubenswrapper[4563]: I1124 09:06:37.347809 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kltmd"
Nov 24 09:06:37 crc kubenswrapper[4563]: I1124 09:06:37.753480 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zjn6s"
Nov 24 09:06:37 crc kubenswrapper[4563]: I1124 09:06:37.753549 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zjn6s"
Nov 24 09:06:37 crc kubenswrapper[4563]: I1124 09:06:37.833237 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-7hx7w"
Nov 24 09:06:37 crc kubenswrapper[4563]: I1124 09:06:37.839183 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-7hx7w"
Nov 24 09:06:38 crc kubenswrapper[4563]: I1124 09:06:38.385110 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kltmd" podUID="10d7e769-e95c-468d-b220-1fae07708825" containerName="registry-server" probeResult="failure" output=<
Nov 24 09:06:38 crc kubenswrapper[4563]: timeout: failed to connect service ":50051" within 1s
Nov 24 09:06:38 crc kubenswrapper[4563]: >
Nov 24 09:06:38 crc kubenswrapper[4563]: I1124 09:06:38.784722 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zjn6s" podUID="a4a8c9f6-6239-4258-8b3a-1d872a211813" containerName="registry-server" probeResult="failure" output=<
Nov 24 09:06:38 crc kubenswrapper[4563]: timeout: failed to connect service ":50051" within 1s
Nov 24 09:06:38 crc kubenswrapper[4563]: >
Nov 24 09:06:38 crc kubenswrapper[4563]: I1124 09:06:38.988229 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 24 09:06:38 crc kubenswrapper[4563]: I1124 09:06:38.988321 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 24 09:06:39 crc kubenswrapper[4563]: I1124 09:06:39.610330 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kn9j4"]
Nov 24 09:06:40 crc kubenswrapper[4563]: I1124 09:06:40.189626 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kn9j4" podUID="2298fd99-7188-4326-a55f-9feb9b25d03d" containerName="registry-server" containerID="cri-o://27a53ae464784842a98a91fdcb2416f605cda39c64eb6f85a202c198552c164d" gracePeriod=2
Nov 24 09:06:40 crc kubenswrapper[4563]: I1124 09:06:40.600363 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kn9j4"
Nov 24 09:06:40 crc kubenswrapper[4563]: I1124 09:06:40.626198 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hsrm\" (UniqueName: \"kubernetes.io/projected/2298fd99-7188-4326-a55f-9feb9b25d03d-kube-api-access-2hsrm\") pod \"2298fd99-7188-4326-a55f-9feb9b25d03d\" (UID: \"2298fd99-7188-4326-a55f-9feb9b25d03d\") "
Nov 24 09:06:40 crc kubenswrapper[4563]: I1124 09:06:40.626406 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2298fd99-7188-4326-a55f-9feb9b25d03d-utilities\") pod \"2298fd99-7188-4326-a55f-9feb9b25d03d\" (UID: \"2298fd99-7188-4326-a55f-9feb9b25d03d\") "
Nov 24 09:06:40 crc kubenswrapper[4563]: I1124 09:06:40.626459 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2298fd99-7188-4326-a55f-9feb9b25d03d-catalog-content\") pod \"2298fd99-7188-4326-a55f-9feb9b25d03d\" (UID: \"2298fd99-7188-4326-a55f-9feb9b25d03d\") "
Nov 24 09:06:40 crc kubenswrapper[4563]: I1124 09:06:40.626987 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2298fd99-7188-4326-a55f-9feb9b25d03d-utilities" (OuterVolumeSpecName: "utilities") pod "2298fd99-7188-4326-a55f-9feb9b25d03d" (UID: "2298fd99-7188-4326-a55f-9feb9b25d03d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 09:06:40 crc kubenswrapper[4563]: I1124 09:06:40.631073 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2298fd99-7188-4326-a55f-9feb9b25d03d-kube-api-access-2hsrm" (OuterVolumeSpecName: "kube-api-access-2hsrm") pod "2298fd99-7188-4326-a55f-9feb9b25d03d" (UID: "2298fd99-7188-4326-a55f-9feb9b25d03d"). InnerVolumeSpecName "kube-api-access-2hsrm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 09:06:40 crc kubenswrapper[4563]: I1124 09:06:40.642300 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2298fd99-7188-4326-a55f-9feb9b25d03d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2298fd99-7188-4326-a55f-9feb9b25d03d" (UID: "2298fd99-7188-4326-a55f-9feb9b25d03d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 09:06:40 crc kubenswrapper[4563]: I1124 09:06:40.728094 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hsrm\" (UniqueName: \"kubernetes.io/projected/2298fd99-7188-4326-a55f-9feb9b25d03d-kube-api-access-2hsrm\") on node \"crc\" DevicePath \"\""
Nov 24 09:06:40 crc kubenswrapper[4563]: I1124 09:06:40.728123 4563 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2298fd99-7188-4326-a55f-9feb9b25d03d-utilities\") on node \"crc\" DevicePath \"\""
Nov 24 09:06:40 crc kubenswrapper[4563]: I1124 09:06:40.728133 4563 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2298fd99-7188-4326-a55f-9feb9b25d03d-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 24 09:06:41 crc kubenswrapper[4563]: I1124 09:06:41.198066 4563 generic.go:334] "Generic (PLEG): container finished" podID="2298fd99-7188-4326-a55f-9feb9b25d03d" containerID="27a53ae464784842a98a91fdcb2416f605cda39c64eb6f85a202c198552c164d" exitCode=0
Nov 24 09:06:41 crc kubenswrapper[4563]: I1124 09:06:41.198151 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kn9j4"
Nov 24 09:06:41 crc kubenswrapper[4563]: I1124 09:06:41.198198 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kn9j4" event={"ID":"2298fd99-7188-4326-a55f-9feb9b25d03d","Type":"ContainerDied","Data":"27a53ae464784842a98a91fdcb2416f605cda39c64eb6f85a202c198552c164d"}
Nov 24 09:06:41 crc kubenswrapper[4563]: I1124 09:06:41.198687 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kn9j4" event={"ID":"2298fd99-7188-4326-a55f-9feb9b25d03d","Type":"ContainerDied","Data":"49108e687fcd1363e94c5c88e9ab14f0d51a638cb4d234d0f0fdaec2e62a843e"}
Nov 24 09:06:41 crc kubenswrapper[4563]: I1124 09:06:41.198718 4563 scope.go:117] "RemoveContainer" containerID="27a53ae464784842a98a91fdcb2416f605cda39c64eb6f85a202c198552c164d"
Nov 24 09:06:41 crc kubenswrapper[4563]: I1124 09:06:41.215789 4563 scope.go:117] "RemoveContainer" containerID="de58599c7e8804a25145dffe1ecce957d7262e5e946f2b5059f54f6049bb008c"
Nov 24 09:06:41 crc kubenswrapper[4563]: I1124 09:06:41.217718 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kn9j4"]
Nov 24 09:06:41 crc kubenswrapper[4563]: I1124 09:06:41.219660 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kn9j4"]
Nov 24 09:06:41 crc kubenswrapper[4563]: I1124 09:06:41.229918 4563 scope.go:117] "RemoveContainer" containerID="344313d4c7a3a6330cc0866ff3c8bfcfaab4a7c0b92194708b363717ec4c65bd"
Nov 24 09:06:41 crc kubenswrapper[4563]: I1124 09:06:41.249805 4563 scope.go:117] "RemoveContainer" containerID="27a53ae464784842a98a91fdcb2416f605cda39c64eb6f85a202c198552c164d"
Nov 24 09:06:41 crc kubenswrapper[4563]: E1124 09:06:41.250165 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27a53ae464784842a98a91fdcb2416f605cda39c64eb6f85a202c198552c164d\": container with ID starting with 27a53ae464784842a98a91fdcb2416f605cda39c64eb6f85a202c198552c164d not found: ID does not exist" containerID="27a53ae464784842a98a91fdcb2416f605cda39c64eb6f85a202c198552c164d"
Nov 24 09:06:41 crc kubenswrapper[4563]: I1124 09:06:41.250208 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27a53ae464784842a98a91fdcb2416f605cda39c64eb6f85a202c198552c164d"} err="failed to get container status \"27a53ae464784842a98a91fdcb2416f605cda39c64eb6f85a202c198552c164d\": rpc error: code = NotFound desc = could not find container \"27a53ae464784842a98a91fdcb2416f605cda39c64eb6f85a202c198552c164d\": container with ID starting with 27a53ae464784842a98a91fdcb2416f605cda39c64eb6f85a202c198552c164d not found: ID does not exist"
Nov 24 09:06:41 crc kubenswrapper[4563]: I1124 09:06:41.250261 4563 scope.go:117] "RemoveContainer" containerID="de58599c7e8804a25145dffe1ecce957d7262e5e946f2b5059f54f6049bb008c"
Nov 24 09:06:41 crc kubenswrapper[4563]: E1124 09:06:41.250609 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de58599c7e8804a25145dffe1ecce957d7262e5e946f2b5059f54f6049bb008c\": container with ID starting with de58599c7e8804a25145dffe1ecce957d7262e5e946f2b5059f54f6049bb008c not found: ID does not exist" containerID="de58599c7e8804a25145dffe1ecce957d7262e5e946f2b5059f54f6049bb008c"
Nov 24 09:06:41 crc kubenswrapper[4563]: I1124 09:06:41.250664 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de58599c7e8804a25145dffe1ecce957d7262e5e946f2b5059f54f6049bb008c"} err="failed to get container status \"de58599c7e8804a25145dffe1ecce957d7262e5e946f2b5059f54f6049bb008c\": rpc error: code = NotFound desc = could not find container \"de58599c7e8804a25145dffe1ecce957d7262e5e946f2b5059f54f6049bb008c\": container with ID starting with de58599c7e8804a25145dffe1ecce957d7262e5e946f2b5059f54f6049bb008c not found: ID does not exist"
Nov 24 09:06:41 crc kubenswrapper[4563]: I1124 09:06:41.250684 4563 scope.go:117] "RemoveContainer" containerID="344313d4c7a3a6330cc0866ff3c8bfcfaab4a7c0b92194708b363717ec4c65bd"
Nov 24 09:06:41 crc kubenswrapper[4563]: E1124 09:06:41.250964 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"344313d4c7a3a6330cc0866ff3c8bfcfaab4a7c0b92194708b363717ec4c65bd\": container with ID starting with 344313d4c7a3a6330cc0866ff3c8bfcfaab4a7c0b92194708b363717ec4c65bd not found: ID does not exist" containerID="344313d4c7a3a6330cc0866ff3c8bfcfaab4a7c0b92194708b363717ec4c65bd"
Nov 24 09:06:41 crc kubenswrapper[4563]: I1124 09:06:41.250989 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"344313d4c7a3a6330cc0866ff3c8bfcfaab4a7c0b92194708b363717ec4c65bd"} err="failed to get container status \"344313d4c7a3a6330cc0866ff3c8bfcfaab4a7c0b92194708b363717ec4c65bd\": rpc error: code = NotFound desc = could not find container \"344313d4c7a3a6330cc0866ff3c8bfcfaab4a7c0b92194708b363717ec4c65bd\": container with ID starting with 344313d4c7a3a6330cc0866ff3c8bfcfaab4a7c0b92194708b363717ec4c65bd not found: ID does not exist"
Nov 24 09:06:43 crc kubenswrapper[4563]: I1124 09:06:43.039515 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nnx6h"]
Nov 24 09:06:43 crc kubenswrapper[4563]: I1124 09:06:43.079244 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2298fd99-7188-4326-a55f-9feb9b25d03d" path="/var/lib/kubelet/pods/2298fd99-7188-4326-a55f-9feb9b25d03d/volumes"
Nov 24 09:06:44 crc kubenswrapper[4563]: I1124 09:06:44.144798 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fkz2h"
Nov 24 09:06:44 crc kubenswrapper[4563]: I1124 09:06:44.145077 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fkz2h"
Nov 24 09:06:44 crc kubenswrapper[4563]: I1124 09:06:44.176402 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fkz2h"
Nov 24 09:06:44 crc kubenswrapper[4563]: I1124 09:06:44.247759 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fkz2h"
Nov 24 09:06:44 crc kubenswrapper[4563]: I1124 09:06:44.333403 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jhn4v"
Nov 24 09:06:44 crc kubenswrapper[4563]: I1124 09:06:44.333444 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jhn4v"
Nov 24 09:06:44 crc kubenswrapper[4563]: I1124 09:06:44.371504 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jhn4v"
Nov 24 09:06:44 crc kubenswrapper[4563]: I1124 09:06:44.544242 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tfx45"
Nov 24 09:06:44 crc kubenswrapper[4563]: I1124 09:06:44.544284 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tfx45"
Nov 24 09:06:44 crc kubenswrapper[4563]: I1124 09:06:44.577177 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tfx45"
Nov 24 09:06:44 crc kubenswrapper[4563]: I1124 09:06:44.752526 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xt68x"
Nov 24 09:06:44 crc kubenswrapper[4563]: I1124 09:06:44.752563 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xt68x"
Nov 24 09:06:44 crc kubenswrapper[4563]: I1124 09:06:44.794994 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xt68x"
Nov 24 09:06:45 crc kubenswrapper[4563]: I1124 09:06:45.253408 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jhn4v"
Nov 24 09:06:45 crc kubenswrapper[4563]: I1124 09:06:45.253894 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tfx45"
Nov 24 09:06:45 crc kubenswrapper[4563]: I1124 09:06:45.254370 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xt68x"
Nov 24 09:06:46 crc kubenswrapper[4563]: I1124 09:06:46.143946 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wp6lh"
Nov 24 09:06:46 crc kubenswrapper[4563]: I1124 09:06:46.144007 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wp6lh"
Nov 24 09:06:46 crc kubenswrapper[4563]: I1124 09:06:46.177530 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wp6lh"
Nov 24 09:06:46 crc kubenswrapper[4563]: I1124 09:06:46.260372 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wp6lh"
Nov 24 09:06:46 crc kubenswrapper[4563]: I1124 09:06:46.613748 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xt68x"]
Nov 24 09:06:46 crc kubenswrapper[4563]: I1124 09:06:46.813492 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tfx45"]
Nov 24 09:06:47 crc kubenswrapper[4563]: I1124
09:06:47.234184 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tfx45" podUID="0b5917ce-6f12-449f-9637-a94eff40aea4" containerName="registry-server" containerID="cri-o://97b34bd51fee84078e7971d495526d63950bbe629b1e6d905e463f264041dd87" gracePeriod=2 Nov 24 09:06:47 crc kubenswrapper[4563]: I1124 09:06:47.379798 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kltmd" Nov 24 09:06:47 crc kubenswrapper[4563]: I1124 09:06:47.413534 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kltmd" Nov 24 09:06:47 crc kubenswrapper[4563]: I1124 09:06:47.688549 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tfx45" Nov 24 09:06:47 crc kubenswrapper[4563]: I1124 09:06:47.734669 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b5917ce-6f12-449f-9637-a94eff40aea4-catalog-content\") pod \"0b5917ce-6f12-449f-9637-a94eff40aea4\" (UID: \"0b5917ce-6f12-449f-9637-a94eff40aea4\") " Nov 24 09:06:47 crc kubenswrapper[4563]: I1124 09:06:47.734773 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf5fr\" (UniqueName: \"kubernetes.io/projected/0b5917ce-6f12-449f-9637-a94eff40aea4-kube-api-access-gf5fr\") pod \"0b5917ce-6f12-449f-9637-a94eff40aea4\" (UID: \"0b5917ce-6f12-449f-9637-a94eff40aea4\") " Nov 24 09:06:47 crc kubenswrapper[4563]: I1124 09:06:47.734935 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b5917ce-6f12-449f-9637-a94eff40aea4-utilities\") pod \"0b5917ce-6f12-449f-9637-a94eff40aea4\" (UID: \"0b5917ce-6f12-449f-9637-a94eff40aea4\") " Nov 24 09:06:47 crc 
kubenswrapper[4563]: I1124 09:06:47.735667 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b5917ce-6f12-449f-9637-a94eff40aea4-utilities" (OuterVolumeSpecName: "utilities") pod "0b5917ce-6f12-449f-9637-a94eff40aea4" (UID: "0b5917ce-6f12-449f-9637-a94eff40aea4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:06:47 crc kubenswrapper[4563]: I1124 09:06:47.741918 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b5917ce-6f12-449f-9637-a94eff40aea4-kube-api-access-gf5fr" (OuterVolumeSpecName: "kube-api-access-gf5fr") pod "0b5917ce-6f12-449f-9637-a94eff40aea4" (UID: "0b5917ce-6f12-449f-9637-a94eff40aea4"). InnerVolumeSpecName "kube-api-access-gf5fr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:06:47 crc kubenswrapper[4563]: I1124 09:06:47.771971 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xjp4v" Nov 24 09:06:47 crc kubenswrapper[4563]: I1124 09:06:47.776788 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b5917ce-6f12-449f-9637-a94eff40aea4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b5917ce-6f12-449f-9637-a94eff40aea4" (UID: "0b5917ce-6f12-449f-9637-a94eff40aea4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:06:47 crc kubenswrapper[4563]: I1124 09:06:47.799539 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zjn6s" Nov 24 09:06:47 crc kubenswrapper[4563]: I1124 09:06:47.836425 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf5fr\" (UniqueName: \"kubernetes.io/projected/0b5917ce-6f12-449f-9637-a94eff40aea4-kube-api-access-gf5fr\") on node \"crc\" DevicePath \"\"" Nov 24 09:06:47 crc kubenswrapper[4563]: I1124 09:06:47.836458 4563 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b5917ce-6f12-449f-9637-a94eff40aea4-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 09:06:47 crc kubenswrapper[4563]: I1124 09:06:47.836469 4563 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b5917ce-6f12-449f-9637-a94eff40aea4-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 09:06:47 crc kubenswrapper[4563]: I1124 09:06:47.841323 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zjn6s" Nov 24 09:06:48 crc kubenswrapper[4563]: I1124 09:06:48.192553 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 24 09:06:48 crc kubenswrapper[4563]: I1124 09:06:48.240757 4563 generic.go:334] "Generic (PLEG): container finished" podID="0b5917ce-6f12-449f-9637-a94eff40aea4" containerID="97b34bd51fee84078e7971d495526d63950bbe629b1e6d905e463f264041dd87" exitCode=0 Nov 24 09:06:48 crc kubenswrapper[4563]: I1124 09:06:48.240815 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tfx45" Nov 24 09:06:48 crc kubenswrapper[4563]: I1124 09:06:48.240829 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfx45" event={"ID":"0b5917ce-6f12-449f-9637-a94eff40aea4","Type":"ContainerDied","Data":"97b34bd51fee84078e7971d495526d63950bbe629b1e6d905e463f264041dd87"} Nov 24 09:06:48 crc kubenswrapper[4563]: I1124 09:06:48.241259 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfx45" event={"ID":"0b5917ce-6f12-449f-9637-a94eff40aea4","Type":"ContainerDied","Data":"0056b4a985b44a58e4d8ad67018ec9759a5d7a6747437998da2e148c2a952c31"} Nov 24 09:06:48 crc kubenswrapper[4563]: I1124 09:06:48.241361 4563 scope.go:117] "RemoveContainer" containerID="97b34bd51fee84078e7971d495526d63950bbe629b1e6d905e463f264041dd87" Nov 24 09:06:48 crc kubenswrapper[4563]: I1124 09:06:48.241389 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xt68x" podUID="d318ef99-9cb4-4f69-81dc-183d64e7532c" containerName="registry-server" containerID="cri-o://bf9a1ca62d67bfe8c376dec3af8078565f39c3d4eae3dd2e48b02e2570c66ae6" gracePeriod=2 Nov 24 09:06:48 crc kubenswrapper[4563]: I1124 09:06:48.265026 4563 scope.go:117] "RemoveContainer" containerID="07e4fb85fd14e47459abad43e90d1a6876f7ae024bddd6f132a4de95c7dbf7c7" Nov 24 09:06:48 crc kubenswrapper[4563]: I1124 09:06:48.276721 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tfx45"] Nov 24 09:06:48 crc kubenswrapper[4563]: I1124 09:06:48.279181 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tfx45"] Nov 24 09:06:48 crc kubenswrapper[4563]: I1124 09:06:48.336083 4563 scope.go:117] "RemoveContainer" containerID="14928ac4a27e070cd1b7b18ad08dd055f9c1644d22d932deab3fbb3dda843b3a" Nov 24 09:06:48 crc 
kubenswrapper[4563]: I1124 09:06:48.350615 4563 scope.go:117] "RemoveContainer" containerID="97b34bd51fee84078e7971d495526d63950bbe629b1e6d905e463f264041dd87" Nov 24 09:06:48 crc kubenswrapper[4563]: E1124 09:06:48.351031 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97b34bd51fee84078e7971d495526d63950bbe629b1e6d905e463f264041dd87\": container with ID starting with 97b34bd51fee84078e7971d495526d63950bbe629b1e6d905e463f264041dd87 not found: ID does not exist" containerID="97b34bd51fee84078e7971d495526d63950bbe629b1e6d905e463f264041dd87" Nov 24 09:06:48 crc kubenswrapper[4563]: I1124 09:06:48.351070 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97b34bd51fee84078e7971d495526d63950bbe629b1e6d905e463f264041dd87"} err="failed to get container status \"97b34bd51fee84078e7971d495526d63950bbe629b1e6d905e463f264041dd87\": rpc error: code = NotFound desc = could not find container \"97b34bd51fee84078e7971d495526d63950bbe629b1e6d905e463f264041dd87\": container with ID starting with 97b34bd51fee84078e7971d495526d63950bbe629b1e6d905e463f264041dd87 not found: ID does not exist" Nov 24 09:06:48 crc kubenswrapper[4563]: I1124 09:06:48.351096 4563 scope.go:117] "RemoveContainer" containerID="07e4fb85fd14e47459abad43e90d1a6876f7ae024bddd6f132a4de95c7dbf7c7" Nov 24 09:06:48 crc kubenswrapper[4563]: E1124 09:06:48.351492 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07e4fb85fd14e47459abad43e90d1a6876f7ae024bddd6f132a4de95c7dbf7c7\": container with ID starting with 07e4fb85fd14e47459abad43e90d1a6876f7ae024bddd6f132a4de95c7dbf7c7 not found: ID does not exist" containerID="07e4fb85fd14e47459abad43e90d1a6876f7ae024bddd6f132a4de95c7dbf7c7" Nov 24 09:06:48 crc kubenswrapper[4563]: I1124 09:06:48.351521 4563 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"07e4fb85fd14e47459abad43e90d1a6876f7ae024bddd6f132a4de95c7dbf7c7"} err="failed to get container status \"07e4fb85fd14e47459abad43e90d1a6876f7ae024bddd6f132a4de95c7dbf7c7\": rpc error: code = NotFound desc = could not find container \"07e4fb85fd14e47459abad43e90d1a6876f7ae024bddd6f132a4de95c7dbf7c7\": container with ID starting with 07e4fb85fd14e47459abad43e90d1a6876f7ae024bddd6f132a4de95c7dbf7c7 not found: ID does not exist" Nov 24 09:06:48 crc kubenswrapper[4563]: I1124 09:06:48.351536 4563 scope.go:117] "RemoveContainer" containerID="14928ac4a27e070cd1b7b18ad08dd055f9c1644d22d932deab3fbb3dda843b3a" Nov 24 09:06:48 crc kubenswrapper[4563]: E1124 09:06:48.351844 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14928ac4a27e070cd1b7b18ad08dd055f9c1644d22d932deab3fbb3dda843b3a\": container with ID starting with 14928ac4a27e070cd1b7b18ad08dd055f9c1644d22d932deab3fbb3dda843b3a not found: ID does not exist" containerID="14928ac4a27e070cd1b7b18ad08dd055f9c1644d22d932deab3fbb3dda843b3a" Nov 24 09:06:48 crc kubenswrapper[4563]: I1124 09:06:48.351867 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14928ac4a27e070cd1b7b18ad08dd055f9c1644d22d932deab3fbb3dda843b3a"} err="failed to get container status \"14928ac4a27e070cd1b7b18ad08dd055f9c1644d22d932deab3fbb3dda843b3a\": rpc error: code = NotFound desc = could not find container \"14928ac4a27e070cd1b7b18ad08dd055f9c1644d22d932deab3fbb3dda843b3a\": container with ID starting with 14928ac4a27e070cd1b7b18ad08dd055f9c1644d22d932deab3fbb3dda843b3a not found: ID does not exist" Nov 24 09:06:48 crc kubenswrapper[4563]: I1124 09:06:48.761524 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xt68x" Nov 24 09:06:48 crc kubenswrapper[4563]: I1124 09:06:48.847740 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d318ef99-9cb4-4f69-81dc-183d64e7532c-utilities\") pod \"d318ef99-9cb4-4f69-81dc-183d64e7532c\" (UID: \"d318ef99-9cb4-4f69-81dc-183d64e7532c\") " Nov 24 09:06:48 crc kubenswrapper[4563]: I1124 09:06:48.847800 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xp5rh\" (UniqueName: \"kubernetes.io/projected/d318ef99-9cb4-4f69-81dc-183d64e7532c-kube-api-access-xp5rh\") pod \"d318ef99-9cb4-4f69-81dc-183d64e7532c\" (UID: \"d318ef99-9cb4-4f69-81dc-183d64e7532c\") " Nov 24 09:06:48 crc kubenswrapper[4563]: I1124 09:06:48.847911 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d318ef99-9cb4-4f69-81dc-183d64e7532c-catalog-content\") pod \"d318ef99-9cb4-4f69-81dc-183d64e7532c\" (UID: \"d318ef99-9cb4-4f69-81dc-183d64e7532c\") " Nov 24 09:06:48 crc kubenswrapper[4563]: I1124 09:06:48.848460 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d318ef99-9cb4-4f69-81dc-183d64e7532c-utilities" (OuterVolumeSpecName: "utilities") pod "d318ef99-9cb4-4f69-81dc-183d64e7532c" (UID: "d318ef99-9cb4-4f69-81dc-183d64e7532c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:06:48 crc kubenswrapper[4563]: I1124 09:06:48.853086 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d318ef99-9cb4-4f69-81dc-183d64e7532c-kube-api-access-xp5rh" (OuterVolumeSpecName: "kube-api-access-xp5rh") pod "d318ef99-9cb4-4f69-81dc-183d64e7532c" (UID: "d318ef99-9cb4-4f69-81dc-183d64e7532c"). InnerVolumeSpecName "kube-api-access-xp5rh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:06:48 crc kubenswrapper[4563]: I1124 09:06:48.889066 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d318ef99-9cb4-4f69-81dc-183d64e7532c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d318ef99-9cb4-4f69-81dc-183d64e7532c" (UID: "d318ef99-9cb4-4f69-81dc-183d64e7532c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:06:48 crc kubenswrapper[4563]: I1124 09:06:48.949065 4563 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d318ef99-9cb4-4f69-81dc-183d64e7532c-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 09:06:48 crc kubenswrapper[4563]: I1124 09:06:48.949099 4563 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d318ef99-9cb4-4f69-81dc-183d64e7532c-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 09:06:48 crc kubenswrapper[4563]: I1124 09:06:48.949118 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xp5rh\" (UniqueName: \"kubernetes.io/projected/d318ef99-9cb4-4f69-81dc-183d64e7532c-kube-api-access-xp5rh\") on node \"crc\" DevicePath \"\"" Nov 24 09:06:49 crc kubenswrapper[4563]: I1124 09:06:49.060290 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b5917ce-6f12-449f-9637-a94eff40aea4" path="/var/lib/kubelet/pods/0b5917ce-6f12-449f-9637-a94eff40aea4/volumes" Nov 24 09:06:49 crc kubenswrapper[4563]: I1124 09:06:49.248230 4563 generic.go:334] "Generic (PLEG): container finished" podID="d318ef99-9cb4-4f69-81dc-183d64e7532c" containerID="bf9a1ca62d67bfe8c376dec3af8078565f39c3d4eae3dd2e48b02e2570c66ae6" exitCode=0 Nov 24 09:06:49 crc kubenswrapper[4563]: I1124 09:06:49.248284 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xt68x" Nov 24 09:06:49 crc kubenswrapper[4563]: I1124 09:06:49.248304 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xt68x" event={"ID":"d318ef99-9cb4-4f69-81dc-183d64e7532c","Type":"ContainerDied","Data":"bf9a1ca62d67bfe8c376dec3af8078565f39c3d4eae3dd2e48b02e2570c66ae6"} Nov 24 09:06:49 crc kubenswrapper[4563]: I1124 09:06:49.248371 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xt68x" event={"ID":"d318ef99-9cb4-4f69-81dc-183d64e7532c","Type":"ContainerDied","Data":"285fbf988eab26feac7c73a15fab0dbd0a44f239dc91c77be11f433edaafda7f"} Nov 24 09:06:49 crc kubenswrapper[4563]: I1124 09:06:49.248405 4563 scope.go:117] "RemoveContainer" containerID="bf9a1ca62d67bfe8c376dec3af8078565f39c3d4eae3dd2e48b02e2570c66ae6" Nov 24 09:06:49 crc kubenswrapper[4563]: I1124 09:06:49.263858 4563 scope.go:117] "RemoveContainer" containerID="85244fa013dd2f6c286b26aa1190923881f81e2b6f1d9c0acec48d8504b5e009" Nov 24 09:06:49 crc kubenswrapper[4563]: I1124 09:06:49.266455 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xt68x"] Nov 24 09:06:49 crc kubenswrapper[4563]: I1124 09:06:49.273589 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xt68x"] Nov 24 09:06:49 crc kubenswrapper[4563]: I1124 09:06:49.278810 4563 scope.go:117] "RemoveContainer" containerID="4fc5e911f87a388fc9cbf7e6d876870d4c4a4faae0b248dee87f038e49271ad5" Nov 24 09:06:49 crc kubenswrapper[4563]: I1124 09:06:49.294590 4563 scope.go:117] "RemoveContainer" containerID="bf9a1ca62d67bfe8c376dec3af8078565f39c3d4eae3dd2e48b02e2570c66ae6" Nov 24 09:06:49 crc kubenswrapper[4563]: E1124 09:06:49.295198 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bf9a1ca62d67bfe8c376dec3af8078565f39c3d4eae3dd2e48b02e2570c66ae6\": container with ID starting with bf9a1ca62d67bfe8c376dec3af8078565f39c3d4eae3dd2e48b02e2570c66ae6 not found: ID does not exist" containerID="bf9a1ca62d67bfe8c376dec3af8078565f39c3d4eae3dd2e48b02e2570c66ae6" Nov 24 09:06:49 crc kubenswrapper[4563]: I1124 09:06:49.295229 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf9a1ca62d67bfe8c376dec3af8078565f39c3d4eae3dd2e48b02e2570c66ae6"} err="failed to get container status \"bf9a1ca62d67bfe8c376dec3af8078565f39c3d4eae3dd2e48b02e2570c66ae6\": rpc error: code = NotFound desc = could not find container \"bf9a1ca62d67bfe8c376dec3af8078565f39c3d4eae3dd2e48b02e2570c66ae6\": container with ID starting with bf9a1ca62d67bfe8c376dec3af8078565f39c3d4eae3dd2e48b02e2570c66ae6 not found: ID does not exist" Nov 24 09:06:49 crc kubenswrapper[4563]: I1124 09:06:49.295251 4563 scope.go:117] "RemoveContainer" containerID="85244fa013dd2f6c286b26aa1190923881f81e2b6f1d9c0acec48d8504b5e009" Nov 24 09:06:49 crc kubenswrapper[4563]: E1124 09:06:49.295581 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85244fa013dd2f6c286b26aa1190923881f81e2b6f1d9c0acec48d8504b5e009\": container with ID starting with 85244fa013dd2f6c286b26aa1190923881f81e2b6f1d9c0acec48d8504b5e009 not found: ID does not exist" containerID="85244fa013dd2f6c286b26aa1190923881f81e2b6f1d9c0acec48d8504b5e009" Nov 24 09:06:49 crc kubenswrapper[4563]: I1124 09:06:49.295604 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85244fa013dd2f6c286b26aa1190923881f81e2b6f1d9c0acec48d8504b5e009"} err="failed to get container status \"85244fa013dd2f6c286b26aa1190923881f81e2b6f1d9c0acec48d8504b5e009\": rpc error: code = NotFound desc = could not find container \"85244fa013dd2f6c286b26aa1190923881f81e2b6f1d9c0acec48d8504b5e009\": container with ID 
starting with 85244fa013dd2f6c286b26aa1190923881f81e2b6f1d9c0acec48d8504b5e009 not found: ID does not exist" Nov 24 09:06:49 crc kubenswrapper[4563]: I1124 09:06:49.295618 4563 scope.go:117] "RemoveContainer" containerID="4fc5e911f87a388fc9cbf7e6d876870d4c4a4faae0b248dee87f038e49271ad5" Nov 24 09:06:49 crc kubenswrapper[4563]: E1124 09:06:49.295996 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fc5e911f87a388fc9cbf7e6d876870d4c4a4faae0b248dee87f038e49271ad5\": container with ID starting with 4fc5e911f87a388fc9cbf7e6d876870d4c4a4faae0b248dee87f038e49271ad5 not found: ID does not exist" containerID="4fc5e911f87a388fc9cbf7e6d876870d4c4a4faae0b248dee87f038e49271ad5" Nov 24 09:06:49 crc kubenswrapper[4563]: I1124 09:06:49.296024 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fc5e911f87a388fc9cbf7e6d876870d4c4a4faae0b248dee87f038e49271ad5"} err="failed to get container status \"4fc5e911f87a388fc9cbf7e6d876870d4c4a4faae0b248dee87f038e49271ad5\": rpc error: code = NotFound desc = could not find container \"4fc5e911f87a388fc9cbf7e6d876870d4c4a4faae0b248dee87f038e49271ad5\": container with ID starting with 4fc5e911f87a388fc9cbf7e6d876870d4c4a4faae0b248dee87f038e49271ad5 not found: ID does not exist" Nov 24 09:06:51 crc kubenswrapper[4563]: I1124 09:06:51.060266 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d318ef99-9cb4-4f69-81dc-183d64e7532c" path="/var/lib/kubelet/pods/d318ef99-9cb4-4f69-81dc-183d64e7532c/volumes" Nov 24 09:06:51 crc kubenswrapper[4563]: I1124 09:06:51.215875 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zjn6s"] Nov 24 09:06:51 crc kubenswrapper[4563]: I1124 09:06:51.216102 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zjn6s" 
podUID="a4a8c9f6-6239-4258-8b3a-1d872a211813" containerName="registry-server" containerID="cri-o://ffe0c46478a97bea63b9970fbe76f6cfd3e8c9bfdddec827cb3ee0814ce49389" gracePeriod=2 Nov 24 09:06:51 crc kubenswrapper[4563]: I1124 09:06:51.610410 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zjn6s" Nov 24 09:06:51 crc kubenswrapper[4563]: I1124 09:06:51.684003 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4a8c9f6-6239-4258-8b3a-1d872a211813-catalog-content\") pod \"a4a8c9f6-6239-4258-8b3a-1d872a211813\" (UID: \"a4a8c9f6-6239-4258-8b3a-1d872a211813\") " Nov 24 09:06:51 crc kubenswrapper[4563]: I1124 09:06:51.684079 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn9st\" (UniqueName: \"kubernetes.io/projected/a4a8c9f6-6239-4258-8b3a-1d872a211813-kube-api-access-gn9st\") pod \"a4a8c9f6-6239-4258-8b3a-1d872a211813\" (UID: \"a4a8c9f6-6239-4258-8b3a-1d872a211813\") " Nov 24 09:06:51 crc kubenswrapper[4563]: I1124 09:06:51.684106 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4a8c9f6-6239-4258-8b3a-1d872a211813-utilities\") pod \"a4a8c9f6-6239-4258-8b3a-1d872a211813\" (UID: \"a4a8c9f6-6239-4258-8b3a-1d872a211813\") " Nov 24 09:06:51 crc kubenswrapper[4563]: I1124 09:06:51.684799 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4a8c9f6-6239-4258-8b3a-1d872a211813-utilities" (OuterVolumeSpecName: "utilities") pod "a4a8c9f6-6239-4258-8b3a-1d872a211813" (UID: "a4a8c9f6-6239-4258-8b3a-1d872a211813"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:06:51 crc kubenswrapper[4563]: I1124 09:06:51.688360 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4a8c9f6-6239-4258-8b3a-1d872a211813-kube-api-access-gn9st" (OuterVolumeSpecName: "kube-api-access-gn9st") pod "a4a8c9f6-6239-4258-8b3a-1d872a211813" (UID: "a4a8c9f6-6239-4258-8b3a-1d872a211813"). InnerVolumeSpecName "kube-api-access-gn9st". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:06:51 crc kubenswrapper[4563]: I1124 09:06:51.745496 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4a8c9f6-6239-4258-8b3a-1d872a211813-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4a8c9f6-6239-4258-8b3a-1d872a211813" (UID: "a4a8c9f6-6239-4258-8b3a-1d872a211813"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:06:51 crc kubenswrapper[4563]: I1124 09:06:51.785908 4563 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4a8c9f6-6239-4258-8b3a-1d872a211813-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 09:06:51 crc kubenswrapper[4563]: I1124 09:06:51.785946 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn9st\" (UniqueName: \"kubernetes.io/projected/a4a8c9f6-6239-4258-8b3a-1d872a211813-kube-api-access-gn9st\") on node \"crc\" DevicePath \"\"" Nov 24 09:06:51 crc kubenswrapper[4563]: I1124 09:06:51.785958 4563 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4a8c9f6-6239-4258-8b3a-1d872a211813-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 09:06:52 crc kubenswrapper[4563]: I1124 09:06:52.268803 4563 generic.go:334] "Generic (PLEG): container finished" podID="a4a8c9f6-6239-4258-8b3a-1d872a211813" 
containerID="ffe0c46478a97bea63b9970fbe76f6cfd3e8c9bfdddec827cb3ee0814ce49389" exitCode=0 Nov 24 09:06:52 crc kubenswrapper[4563]: I1124 09:06:52.268849 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zjn6s" Nov 24 09:06:52 crc kubenswrapper[4563]: I1124 09:06:52.268866 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjn6s" event={"ID":"a4a8c9f6-6239-4258-8b3a-1d872a211813","Type":"ContainerDied","Data":"ffe0c46478a97bea63b9970fbe76f6cfd3e8c9bfdddec827cb3ee0814ce49389"} Nov 24 09:06:52 crc kubenswrapper[4563]: I1124 09:06:52.269252 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjn6s" event={"ID":"a4a8c9f6-6239-4258-8b3a-1d872a211813","Type":"ContainerDied","Data":"2711cb70da62ca0e0e285a86fbbc4e9f8cc047a9e2f85807b0600376be5a7ce7"} Nov 24 09:06:52 crc kubenswrapper[4563]: I1124 09:06:52.269270 4563 scope.go:117] "RemoveContainer" containerID="ffe0c46478a97bea63b9970fbe76f6cfd3e8c9bfdddec827cb3ee0814ce49389" Nov 24 09:06:52 crc kubenswrapper[4563]: I1124 09:06:52.295737 4563 scope.go:117] "RemoveContainer" containerID="fd0bf477a5a3ac7f8380f2f2987d370916d20ab3ed1ccaa2e03c45a43d0b33a5" Nov 24 09:06:52 crc kubenswrapper[4563]: I1124 09:06:52.308322 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zjn6s"] Nov 24 09:06:52 crc kubenswrapper[4563]: I1124 09:06:52.314337 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zjn6s"] Nov 24 09:06:52 crc kubenswrapper[4563]: I1124 09:06:52.335977 4563 scope.go:117] "RemoveContainer" containerID="44afe1ed8ab549ef39ef1c146967bee926749a149c398cfb33050193fe2fe8c6" Nov 24 09:06:52 crc kubenswrapper[4563]: I1124 09:06:52.349083 4563 scope.go:117] "RemoveContainer" containerID="ffe0c46478a97bea63b9970fbe76f6cfd3e8c9bfdddec827cb3ee0814ce49389" Nov 24 09:06:52 crc 
kubenswrapper[4563]: E1124 09:06:52.349465 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffe0c46478a97bea63b9970fbe76f6cfd3e8c9bfdddec827cb3ee0814ce49389\": container with ID starting with ffe0c46478a97bea63b9970fbe76f6cfd3e8c9bfdddec827cb3ee0814ce49389 not found: ID does not exist" containerID="ffe0c46478a97bea63b9970fbe76f6cfd3e8c9bfdddec827cb3ee0814ce49389" Nov 24 09:06:52 crc kubenswrapper[4563]: I1124 09:06:52.349499 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffe0c46478a97bea63b9970fbe76f6cfd3e8c9bfdddec827cb3ee0814ce49389"} err="failed to get container status \"ffe0c46478a97bea63b9970fbe76f6cfd3e8c9bfdddec827cb3ee0814ce49389\": rpc error: code = NotFound desc = could not find container \"ffe0c46478a97bea63b9970fbe76f6cfd3e8c9bfdddec827cb3ee0814ce49389\": container with ID starting with ffe0c46478a97bea63b9970fbe76f6cfd3e8c9bfdddec827cb3ee0814ce49389 not found: ID does not exist" Nov 24 09:06:52 crc kubenswrapper[4563]: I1124 09:06:52.349521 4563 scope.go:117] "RemoveContainer" containerID="fd0bf477a5a3ac7f8380f2f2987d370916d20ab3ed1ccaa2e03c45a43d0b33a5" Nov 24 09:06:52 crc kubenswrapper[4563]: E1124 09:06:52.349849 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd0bf477a5a3ac7f8380f2f2987d370916d20ab3ed1ccaa2e03c45a43d0b33a5\": container with ID starting with fd0bf477a5a3ac7f8380f2f2987d370916d20ab3ed1ccaa2e03c45a43d0b33a5 not found: ID does not exist" containerID="fd0bf477a5a3ac7f8380f2f2987d370916d20ab3ed1ccaa2e03c45a43d0b33a5" Nov 24 09:06:52 crc kubenswrapper[4563]: I1124 09:06:52.349871 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd0bf477a5a3ac7f8380f2f2987d370916d20ab3ed1ccaa2e03c45a43d0b33a5"} err="failed to get container status 
\"fd0bf477a5a3ac7f8380f2f2987d370916d20ab3ed1ccaa2e03c45a43d0b33a5\": rpc error: code = NotFound desc = could not find container \"fd0bf477a5a3ac7f8380f2f2987d370916d20ab3ed1ccaa2e03c45a43d0b33a5\": container with ID starting with fd0bf477a5a3ac7f8380f2f2987d370916d20ab3ed1ccaa2e03c45a43d0b33a5 not found: ID does not exist" Nov 24 09:06:52 crc kubenswrapper[4563]: I1124 09:06:52.349886 4563 scope.go:117] "RemoveContainer" containerID="44afe1ed8ab549ef39ef1c146967bee926749a149c398cfb33050193fe2fe8c6" Nov 24 09:06:52 crc kubenswrapper[4563]: E1124 09:06:52.350111 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44afe1ed8ab549ef39ef1c146967bee926749a149c398cfb33050193fe2fe8c6\": container with ID starting with 44afe1ed8ab549ef39ef1c146967bee926749a149c398cfb33050193fe2fe8c6 not found: ID does not exist" containerID="44afe1ed8ab549ef39ef1c146967bee926749a149c398cfb33050193fe2fe8c6" Nov 24 09:06:52 crc kubenswrapper[4563]: I1124 09:06:52.350134 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44afe1ed8ab549ef39ef1c146967bee926749a149c398cfb33050193fe2fe8c6"} err="failed to get container status \"44afe1ed8ab549ef39ef1c146967bee926749a149c398cfb33050193fe2fe8c6\": rpc error: code = NotFound desc = could not find container \"44afe1ed8ab549ef39ef1c146967bee926749a149c398cfb33050193fe2fe8c6\": container with ID starting with 44afe1ed8ab549ef39ef1c146967bee926749a149c398cfb33050193fe2fe8c6 not found: ID does not exist" Nov 24 09:06:53 crc kubenswrapper[4563]: I1124 09:06:53.066389 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4a8c9f6-6239-4258-8b3a-1d872a211813" path="/var/lib/kubelet/pods/a4a8c9f6-6239-4258-8b3a-1d872a211813/volumes" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.068623 4563 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" podUID="245aea2a-4167-418a-910c-91bf4836c8dc" containerName="oauth-openshift" containerID="cri-o://00b57996bc38954c56b1780dec352c78b2db45b2ba61bcb2046a509c497d4c53" gracePeriod=15 Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.369951 4563 generic.go:334] "Generic (PLEG): container finished" podID="245aea2a-4167-418a-910c-91bf4836c8dc" containerID="00b57996bc38954c56b1780dec352c78b2db45b2ba61bcb2046a509c497d4c53" exitCode=0 Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.369988 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" event={"ID":"245aea2a-4167-418a-910c-91bf4836c8dc","Type":"ContainerDied","Data":"00b57996bc38954c56b1780dec352c78b2db45b2ba61bcb2046a509c497d4c53"} Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.413240 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.440459 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5db8794bf8-hrtkz"] Nov 24 09:07:08 crc kubenswrapper[4563]: E1124 09:07:08.440830 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b5917ce-6f12-449f-9637-a94eff40aea4" containerName="registry-server" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.440919 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b5917ce-6f12-449f-9637-a94eff40aea4" containerName="registry-server" Nov 24 09:07:08 crc kubenswrapper[4563]: E1124 09:07:08.440980 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a8c9f6-6239-4258-8b3a-1d872a211813" containerName="registry-server" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.441034 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a8c9f6-6239-4258-8b3a-1d872a211813" 
containerName="registry-server" Nov 24 09:07:08 crc kubenswrapper[4563]: E1124 09:07:08.441092 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d318ef99-9cb4-4f69-81dc-183d64e7532c" containerName="extract-utilities" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.441141 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="d318ef99-9cb4-4f69-81dc-183d64e7532c" containerName="extract-utilities" Nov 24 09:07:08 crc kubenswrapper[4563]: E1124 09:07:08.441189 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a8c9f6-6239-4258-8b3a-1d872a211813" containerName="extract-content" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.441237 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a8c9f6-6239-4258-8b3a-1d872a211813" containerName="extract-content" Nov 24 09:07:08 crc kubenswrapper[4563]: E1124 09:07:08.441302 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d318ef99-9cb4-4f69-81dc-183d64e7532c" containerName="registry-server" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.441360 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="d318ef99-9cb4-4f69-81dc-183d64e7532c" containerName="registry-server" Nov 24 09:07:08 crc kubenswrapper[4563]: E1124 09:07:08.441423 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2298fd99-7188-4326-a55f-9feb9b25d03d" containerName="extract-content" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.441472 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="2298fd99-7188-4326-a55f-9feb9b25d03d" containerName="extract-content" Nov 24 09:07:08 crc kubenswrapper[4563]: E1124 09:07:08.441529 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020dab9d-2ca7-4fc4-b6c1-4094bd50677a" containerName="pruner" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.441579 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="020dab9d-2ca7-4fc4-b6c1-4094bd50677a" 
containerName="pruner" Nov 24 09:07:08 crc kubenswrapper[4563]: E1124 09:07:08.441660 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b5917ce-6f12-449f-9637-a94eff40aea4" containerName="extract-content" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.441717 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b5917ce-6f12-449f-9637-a94eff40aea4" containerName="extract-content" Nov 24 09:07:08 crc kubenswrapper[4563]: E1124 09:07:08.441767 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a8c9f6-6239-4258-8b3a-1d872a211813" containerName="extract-utilities" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.441823 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a8c9f6-6239-4258-8b3a-1d872a211813" containerName="extract-utilities" Nov 24 09:07:08 crc kubenswrapper[4563]: E1124 09:07:08.441871 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2298fd99-7188-4326-a55f-9feb9b25d03d" containerName="extract-utilities" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.441925 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="2298fd99-7188-4326-a55f-9feb9b25d03d" containerName="extract-utilities" Nov 24 09:07:08 crc kubenswrapper[4563]: E1124 09:07:08.441978 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="245aea2a-4167-418a-910c-91bf4836c8dc" containerName="oauth-openshift" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.442022 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="245aea2a-4167-418a-910c-91bf4836c8dc" containerName="oauth-openshift" Nov 24 09:07:08 crc kubenswrapper[4563]: E1124 09:07:08.442071 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b5917ce-6f12-449f-9637-a94eff40aea4" containerName="extract-utilities" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.442115 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b5917ce-6f12-449f-9637-a94eff40aea4" 
containerName="extract-utilities" Nov 24 09:07:08 crc kubenswrapper[4563]: E1124 09:07:08.442162 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2298fd99-7188-4326-a55f-9feb9b25d03d" containerName="registry-server" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.442204 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="2298fd99-7188-4326-a55f-9feb9b25d03d" containerName="registry-server" Nov 24 09:07:08 crc kubenswrapper[4563]: E1124 09:07:08.442259 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d318ef99-9cb4-4f69-81dc-183d64e7532c" containerName="extract-content" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.442305 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="d318ef99-9cb4-4f69-81dc-183d64e7532c" containerName="extract-content" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.442425 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b5917ce-6f12-449f-9637-a94eff40aea4" containerName="registry-server" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.442487 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="245aea2a-4167-418a-910c-91bf4836c8dc" containerName="oauth-openshift" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.442536 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="2298fd99-7188-4326-a55f-9feb9b25d03d" containerName="registry-server" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.442586 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="d318ef99-9cb4-4f69-81dc-183d64e7532c" containerName="registry-server" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.442700 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="020dab9d-2ca7-4fc4-b6c1-4094bd50677a" containerName="pruner" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.442768 4563 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a4a8c9f6-6239-4258-8b3a-1d872a211813" containerName="registry-server" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.443184 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.452549 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5db8794bf8-hrtkz"] Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.506697 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-cliconfig\") pod \"245aea2a-4167-418a-910c-91bf4836c8dc\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.506870 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-user-idp-0-file-data\") pod \"245aea2a-4167-418a-910c-91bf4836c8dc\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.506964 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-user-template-provider-selection\") pod \"245aea2a-4167-418a-910c-91bf4836c8dc\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.507055 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-ocp-branding-template\") pod \"245aea2a-4167-418a-910c-91bf4836c8dc\" 
(UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.507147 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-serving-cert\") pod \"245aea2a-4167-418a-910c-91bf4836c8dc\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.507246 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-service-ca\") pod \"245aea2a-4167-418a-910c-91bf4836c8dc\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.507358 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-session\") pod \"245aea2a-4167-418a-910c-91bf4836c8dc\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.507440 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/245aea2a-4167-418a-910c-91bf4836c8dc-audit-policies\") pod \"245aea2a-4167-418a-910c-91bf4836c8dc\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.507530 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jk4l\" (UniqueName: \"kubernetes.io/projected/245aea2a-4167-418a-910c-91bf4836c8dc-kube-api-access-5jk4l\") pod \"245aea2a-4167-418a-910c-91bf4836c8dc\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.507609 4563 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-user-template-login\") pod \"245aea2a-4167-418a-910c-91bf4836c8dc\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.507779 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-trusted-ca-bundle\") pod \"245aea2a-4167-418a-910c-91bf4836c8dc\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.507875 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-user-template-error\") pod \"245aea2a-4167-418a-910c-91bf4836c8dc\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.507526 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "245aea2a-4167-418a-910c-91bf4836c8dc" (UID: "245aea2a-4167-418a-910c-91bf4836c8dc"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.507890 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "245aea2a-4167-418a-910c-91bf4836c8dc" (UID: "245aea2a-4167-418a-910c-91bf4836c8dc"). 
InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.507916 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/245aea2a-4167-418a-910c-91bf4836c8dc-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "245aea2a-4167-418a-910c-91bf4836c8dc" (UID: "245aea2a-4167-418a-910c-91bf4836c8dc"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.508093 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/245aea2a-4167-418a-910c-91bf4836c8dc-audit-dir\") pod \"245aea2a-4167-418a-910c-91bf4836c8dc\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.508146 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "245aea2a-4167-418a-910c-91bf4836c8dc" (UID: "245aea2a-4167-418a-910c-91bf4836c8dc"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.508162 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/245aea2a-4167-418a-910c-91bf4836c8dc-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "245aea2a-4167-418a-910c-91bf4836c8dc" (UID: "245aea2a-4167-418a-910c-91bf4836c8dc"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.508277 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-router-certs\") pod \"245aea2a-4167-418a-910c-91bf4836c8dc\" (UID: \"245aea2a-4167-418a-910c-91bf4836c8dc\") " Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.508500 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-system-session\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.508593 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f129a84e-be02-42c8-bfcf-fd4fb027988f-audit-policies\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.508714 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-system-router-certs\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.508796 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-user-template-error\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.508863 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.508937 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-system-service-ca\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.509020 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.509090 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f129a84e-be02-42c8-bfcf-fd4fb027988f-audit-dir\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: 
\"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.509236 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.509335 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-user-template-login\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.509394 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.509466 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 
crc kubenswrapper[4563]: I1124 09:07:08.509489 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.509517 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfhxf\" (UniqueName: \"kubernetes.io/projected/f129a84e-be02-42c8-bfcf-fd4fb027988f-kube-api-access-kfhxf\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.509618 4563 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.509671 4563 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/245aea2a-4167-418a-910c-91bf4836c8dc-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.509686 4563 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.509700 4563 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/245aea2a-4167-418a-910c-91bf4836c8dc-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.509713 4563 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.512538 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "245aea2a-4167-418a-910c-91bf4836c8dc" (UID: "245aea2a-4167-418a-910c-91bf4836c8dc"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.512719 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "245aea2a-4167-418a-910c-91bf4836c8dc" (UID: "245aea2a-4167-418a-910c-91bf4836c8dc"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.513092 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "245aea2a-4167-418a-910c-91bf4836c8dc" (UID: "245aea2a-4167-418a-910c-91bf4836c8dc"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.513690 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/245aea2a-4167-418a-910c-91bf4836c8dc-kube-api-access-5jk4l" (OuterVolumeSpecName: "kube-api-access-5jk4l") pod "245aea2a-4167-418a-910c-91bf4836c8dc" (UID: "245aea2a-4167-418a-910c-91bf4836c8dc"). InnerVolumeSpecName "kube-api-access-5jk4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.514440 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "245aea2a-4167-418a-910c-91bf4836c8dc" (UID: "245aea2a-4167-418a-910c-91bf4836c8dc"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.514695 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "245aea2a-4167-418a-910c-91bf4836c8dc" (UID: "245aea2a-4167-418a-910c-91bf4836c8dc"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.515098 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "245aea2a-4167-418a-910c-91bf4836c8dc" (UID: "245aea2a-4167-418a-910c-91bf4836c8dc"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.515205 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "245aea2a-4167-418a-910c-91bf4836c8dc" (UID: "245aea2a-4167-418a-910c-91bf4836c8dc"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.515396 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "245aea2a-4167-418a-910c-91bf4836c8dc" (UID: "245aea2a-4167-418a-910c-91bf4836c8dc"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.610764 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.610848 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 
09:07:08.610877 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfhxf\" (UniqueName: \"kubernetes.io/projected/f129a84e-be02-42c8-bfcf-fd4fb027988f-kube-api-access-kfhxf\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.610932 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-system-session\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.610960 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f129a84e-be02-42c8-bfcf-fd4fb027988f-audit-policies\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.611005 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-system-router-certs\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.611026 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-user-template-error\") pod \"oauth-openshift-5db8794bf8-hrtkz\" 
(UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.611047 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.611089 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-system-service-ca\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.611115 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.611161 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f129a84e-be02-42c8-bfcf-fd4fb027988f-audit-dir\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.611188 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.611245 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-user-template-login\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.611272 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.611321 4563 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.611334 4563 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.611344 4563 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.611356 4563 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.611367 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jk4l\" (UniqueName: \"kubernetes.io/projected/245aea2a-4167-418a-910c-91bf4836c8dc-kube-api-access-5jk4l\") on node \"crc\" DevicePath \"\"" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.611376 4563 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.611388 4563 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.611400 4563 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.611413 4563 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/245aea2a-4167-418a-910c-91bf4836c8dc-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 
09:07:08.612285 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f129a84e-be02-42c8-bfcf-fd4fb027988f-audit-dir\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.612320 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.612489 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f129a84e-be02-42c8-bfcf-fd4fb027988f-audit-policies\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.612718 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.613023 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-system-service-ca\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") 
" pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.614666 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-user-template-error\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.614851 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.615426 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.615868 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.616091 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-system-session\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.616451 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-system-router-certs\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.617396 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.617589 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f129a84e-be02-42c8-bfcf-fd4fb027988f-v4-0-config-user-template-login\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.626544 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfhxf\" (UniqueName: \"kubernetes.io/projected/f129a84e-be02-42c8-bfcf-fd4fb027988f-kube-api-access-kfhxf\") pod \"oauth-openshift-5db8794bf8-hrtkz\" (UID: \"f129a84e-be02-42c8-bfcf-fd4fb027988f\") " 
pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.755037 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.987518 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.987577 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.987627 4563 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.988217 4563 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd"} pod="openshift-machine-config-operator/machine-config-daemon-stlxr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 09:07:08 crc kubenswrapper[4563]: I1124 09:07:08.988277 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" 
containerID="cri-o://6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd" gracePeriod=600 Nov 24 09:07:09 crc kubenswrapper[4563]: I1124 09:07:09.113057 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5db8794bf8-hrtkz"] Nov 24 09:07:09 crc kubenswrapper[4563]: I1124 09:07:09.376623 4563 generic.go:334] "Generic (PLEG): container finished" podID="3b2bfe55-8989-49b3-bb61-e28189447627" containerID="6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd" exitCode=0 Nov 24 09:07:09 crc kubenswrapper[4563]: I1124 09:07:09.376691 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" event={"ID":"3b2bfe55-8989-49b3-bb61-e28189447627","Type":"ContainerDied","Data":"6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd"} Nov 24 09:07:09 crc kubenswrapper[4563]: I1124 09:07:09.376898 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" event={"ID":"3b2bfe55-8989-49b3-bb61-e28189447627","Type":"ContainerStarted","Data":"693aaa2fd38048eca425d5cf8bf8e834a76fe87db5bd736efd4bf0270f272397"} Nov 24 09:07:09 crc kubenswrapper[4563]: I1124 09:07:09.378269 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" event={"ID":"f129a84e-be02-42c8-bfcf-fd4fb027988f","Type":"ContainerStarted","Data":"8a72d1ad65827c675d2a2a045c35b576ac7880ac08cf53ba4aade97db7c9cee7"} Nov 24 09:07:09 crc kubenswrapper[4563]: I1124 09:07:09.378324 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" event={"ID":"f129a84e-be02-42c8-bfcf-fd4fb027988f","Type":"ContainerStarted","Data":"178ebeaeb1fb517d6d53a94d4fb2d16ff0bc2add24485833538e233f07372410"} Nov 24 09:07:09 crc kubenswrapper[4563]: I1124 09:07:09.378445 4563 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:09 crc kubenswrapper[4563]: I1124 09:07:09.379706 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" event={"ID":"245aea2a-4167-418a-910c-91bf4836c8dc","Type":"ContainerDied","Data":"ff96cd001028e801d0699fc728f28fe7b7bba50740c32f1bacfc3dc72547a875"} Nov 24 09:07:09 crc kubenswrapper[4563]: I1124 09:07:09.379748 4563 scope.go:117] "RemoveContainer" containerID="00b57996bc38954c56b1780dec352c78b2db45b2ba61bcb2046a509c497d4c53" Nov 24 09:07:09 crc kubenswrapper[4563]: I1124 09:07:09.379806 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-nnx6h" Nov 24 09:07:09 crc kubenswrapper[4563]: I1124 09:07:09.411143 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" podStartSLOduration=26.411133217 podStartE2EDuration="26.411133217s" podCreationTimestamp="2025-11-24 09:06:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:07:09.409710635 +0000 UTC m=+206.668688082" watchObservedRunningTime="2025-11-24 09:07:09.411133217 +0000 UTC m=+206.670110664" Nov 24 09:07:09 crc kubenswrapper[4563]: I1124 09:07:09.420229 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nnx6h"] Nov 24 09:07:09 crc kubenswrapper[4563]: I1124 09:07:09.423007 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-nnx6h"] Nov 24 09:07:09 crc kubenswrapper[4563]: I1124 09:07:09.719408 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5db8794bf8-hrtkz" Nov 24 09:07:11 crc 
kubenswrapper[4563]: I1124 09:07:11.063269 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="245aea2a-4167-418a-910c-91bf4836c8dc" path="/var/lib/kubelet/pods/245aea2a-4167-418a-910c-91bf4836c8dc/volumes" Nov 24 09:07:41 crc kubenswrapper[4563]: I1124 09:07:41.782654 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fkz2h"] Nov 24 09:07:41 crc kubenswrapper[4563]: I1124 09:07:41.783530 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fkz2h" podUID="b8c919bf-e04d-4c09-84fa-064f434383bb" containerName="registry-server" containerID="cri-o://1a7fcfc73514912064910e1c8c5ca88be3b96eaceca810b00db895b968d14343" gracePeriod=30 Nov 24 09:07:41 crc kubenswrapper[4563]: I1124 09:07:41.789716 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jhn4v"] Nov 24 09:07:41 crc kubenswrapper[4563]: I1124 09:07:41.789918 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jhn4v" podUID="c852825f-7fac-45d3-b801-ae6eb253989a" containerName="registry-server" containerID="cri-o://bd1b821eb6953f3bee8ea0a55411fe55393dd64decd840c91a41df3c982c6924" gracePeriod=30 Nov 24 09:07:41 crc kubenswrapper[4563]: I1124 09:07:41.809875 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zmp5c"] Nov 24 09:07:41 crc kubenswrapper[4563]: I1124 09:07:41.810112 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-zmp5c" podUID="4e13a5b1-f9f7-4045-952a-a44cfd536a99" containerName="marketplace-operator" containerID="cri-o://2888f7358774a3fccbfb2cfda7ab8cd22bbb1457aa038676a408e6e31aaa14e3" gracePeriod=30 Nov 24 09:07:41 crc kubenswrapper[4563]: I1124 09:07:41.824367 4563 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-wp6lh"] Nov 24 09:07:41 crc kubenswrapper[4563]: I1124 09:07:41.824632 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wp6lh" podUID="41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e" containerName="registry-server" containerID="cri-o://2037f7894bd1ebe2835c26ea5f66c5ca7178fef508a345ba186f128d502f0fe5" gracePeriod=30 Nov 24 09:07:41 crc kubenswrapper[4563]: I1124 09:07:41.827102 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kltmd"] Nov 24 09:07:41 crc kubenswrapper[4563]: I1124 09:07:41.827471 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kltmd" podUID="10d7e769-e95c-468d-b220-1fae07708825" containerName="registry-server" containerID="cri-o://4c9e7f41601e4dc7bea0e711a68fb94187c68eeec6fb26ffb9b367eb60ea769e" gracePeriod=30 Nov 24 09:07:41 crc kubenswrapper[4563]: I1124 09:07:41.832759 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5pxln"] Nov 24 09:07:41 crc kubenswrapper[4563]: I1124 09:07:41.833468 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5pxln" Nov 24 09:07:41 crc kubenswrapper[4563]: I1124 09:07:41.835804 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5pxln"] Nov 24 09:07:41 crc kubenswrapper[4563]: I1124 09:07:41.873983 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/157ed1a3-ea31-4a6b-8e91-2852d4c50600-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5pxln\" (UID: \"157ed1a3-ea31-4a6b-8e91-2852d4c50600\") " pod="openshift-marketplace/marketplace-operator-79b997595-5pxln" Nov 24 09:07:41 crc kubenswrapper[4563]: I1124 09:07:41.874023 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9nfm\" (UniqueName: \"kubernetes.io/projected/157ed1a3-ea31-4a6b-8e91-2852d4c50600-kube-api-access-z9nfm\") pod \"marketplace-operator-79b997595-5pxln\" (UID: \"157ed1a3-ea31-4a6b-8e91-2852d4c50600\") " pod="openshift-marketplace/marketplace-operator-79b997595-5pxln" Nov 24 09:07:41 crc kubenswrapper[4563]: I1124 09:07:41.874052 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/157ed1a3-ea31-4a6b-8e91-2852d4c50600-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5pxln\" (UID: \"157ed1a3-ea31-4a6b-8e91-2852d4c50600\") " pod="openshift-marketplace/marketplace-operator-79b997595-5pxln" Nov 24 09:07:41 crc kubenswrapper[4563]: I1124 09:07:41.976050 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/157ed1a3-ea31-4a6b-8e91-2852d4c50600-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5pxln\" (UID: 
\"157ed1a3-ea31-4a6b-8e91-2852d4c50600\") " pod="openshift-marketplace/marketplace-operator-79b997595-5pxln" Nov 24 09:07:41 crc kubenswrapper[4563]: I1124 09:07:41.976094 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9nfm\" (UniqueName: \"kubernetes.io/projected/157ed1a3-ea31-4a6b-8e91-2852d4c50600-kube-api-access-z9nfm\") pod \"marketplace-operator-79b997595-5pxln\" (UID: \"157ed1a3-ea31-4a6b-8e91-2852d4c50600\") " pod="openshift-marketplace/marketplace-operator-79b997595-5pxln" Nov 24 09:07:41 crc kubenswrapper[4563]: I1124 09:07:41.976122 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/157ed1a3-ea31-4a6b-8e91-2852d4c50600-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5pxln\" (UID: \"157ed1a3-ea31-4a6b-8e91-2852d4c50600\") " pod="openshift-marketplace/marketplace-operator-79b997595-5pxln" Nov 24 09:07:41 crc kubenswrapper[4563]: I1124 09:07:41.977791 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/157ed1a3-ea31-4a6b-8e91-2852d4c50600-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5pxln\" (UID: \"157ed1a3-ea31-4a6b-8e91-2852d4c50600\") " pod="openshift-marketplace/marketplace-operator-79b997595-5pxln" Nov 24 09:07:41 crc kubenswrapper[4563]: I1124 09:07:41.981893 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/157ed1a3-ea31-4a6b-8e91-2852d4c50600-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5pxln\" (UID: \"157ed1a3-ea31-4a6b-8e91-2852d4c50600\") " pod="openshift-marketplace/marketplace-operator-79b997595-5pxln" Nov 24 09:07:41 crc kubenswrapper[4563]: I1124 09:07:41.994777 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9nfm\" 
(UniqueName: \"kubernetes.io/projected/157ed1a3-ea31-4a6b-8e91-2852d4c50600-kube-api-access-z9nfm\") pod \"marketplace-operator-79b997595-5pxln\" (UID: \"157ed1a3-ea31-4a6b-8e91-2852d4c50600\") " pod="openshift-marketplace/marketplace-operator-79b997595-5pxln" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.202754 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5pxln" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.206475 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fkz2h" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.211352 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wp6lh" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.213977 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jhn4v" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.218216 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kltmd" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.247667 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zmp5c" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.281766 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkmgq\" (UniqueName: \"kubernetes.io/projected/41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e-kube-api-access-qkmgq\") pod \"41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e\" (UID: \"41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e\") " Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.281808 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e-catalog-content\") pod \"41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e\" (UID: \"41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e\") " Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.281841 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26hcg\" (UniqueName: \"kubernetes.io/projected/10d7e769-e95c-468d-b220-1fae07708825-kube-api-access-26hcg\") pod \"10d7e769-e95c-468d-b220-1fae07708825\" (UID: \"10d7e769-e95c-468d-b220-1fae07708825\") " Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.281869 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10d7e769-e95c-468d-b220-1fae07708825-catalog-content\") pod \"10d7e769-e95c-468d-b220-1fae07708825\" (UID: \"10d7e769-e95c-468d-b220-1fae07708825\") " Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.281903 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pthfb\" (UniqueName: \"kubernetes.io/projected/b8c919bf-e04d-4c09-84fa-064f434383bb-kube-api-access-pthfb\") pod \"b8c919bf-e04d-4c09-84fa-064f434383bb\" (UID: \"b8c919bf-e04d-4c09-84fa-064f434383bb\") " Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.281926 4563 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c852825f-7fac-45d3-b801-ae6eb253989a-utilities\") pod \"c852825f-7fac-45d3-b801-ae6eb253989a\" (UID: \"c852825f-7fac-45d3-b801-ae6eb253989a\") " Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.281941 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10d7e769-e95c-468d-b220-1fae07708825-utilities\") pod \"10d7e769-e95c-468d-b220-1fae07708825\" (UID: \"10d7e769-e95c-468d-b220-1fae07708825\") " Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.281968 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8c919bf-e04d-4c09-84fa-064f434383bb-catalog-content\") pod \"b8c919bf-e04d-4c09-84fa-064f434383bb\" (UID: \"b8c919bf-e04d-4c09-84fa-064f434383bb\") " Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.281988 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw8cf\" (UniqueName: \"kubernetes.io/projected/c852825f-7fac-45d3-b801-ae6eb253989a-kube-api-access-rw8cf\") pod \"c852825f-7fac-45d3-b801-ae6eb253989a\" (UID: \"c852825f-7fac-45d3-b801-ae6eb253989a\") " Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.282007 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4e13a5b1-f9f7-4045-952a-a44cfd536a99-marketplace-operator-metrics\") pod \"4e13a5b1-f9f7-4045-952a-a44cfd536a99\" (UID: \"4e13a5b1-f9f7-4045-952a-a44cfd536a99\") " Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.282037 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c852825f-7fac-45d3-b801-ae6eb253989a-catalog-content\") pod 
\"c852825f-7fac-45d3-b801-ae6eb253989a\" (UID: \"c852825f-7fac-45d3-b801-ae6eb253989a\") " Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.282604 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e-utilities\") pod \"41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e\" (UID: \"41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e\") " Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.282649 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffvcw\" (UniqueName: \"kubernetes.io/projected/4e13a5b1-f9f7-4045-952a-a44cfd536a99-kube-api-access-ffvcw\") pod \"4e13a5b1-f9f7-4045-952a-a44cfd536a99\" (UID: \"4e13a5b1-f9f7-4045-952a-a44cfd536a99\") " Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.282668 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e13a5b1-f9f7-4045-952a-a44cfd536a99-marketplace-trusted-ca\") pod \"4e13a5b1-f9f7-4045-952a-a44cfd536a99\" (UID: \"4e13a5b1-f9f7-4045-952a-a44cfd536a99\") " Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.282694 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8c919bf-e04d-4c09-84fa-064f434383bb-utilities\") pod \"b8c919bf-e04d-4c09-84fa-064f434383bb\" (UID: \"b8c919bf-e04d-4c09-84fa-064f434383bb\") " Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.285084 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10d7e769-e95c-468d-b220-1fae07708825-utilities" (OuterVolumeSpecName: "utilities") pod "10d7e769-e95c-468d-b220-1fae07708825" (UID: "10d7e769-e95c-468d-b220-1fae07708825"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.286876 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8c919bf-e04d-4c09-84fa-064f434383bb-utilities" (OuterVolumeSpecName: "utilities") pod "b8c919bf-e04d-4c09-84fa-064f434383bb" (UID: "b8c919bf-e04d-4c09-84fa-064f434383bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.292028 4563 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8c919bf-e04d-4c09-84fa-064f434383bb-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.292052 4563 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10d7e769-e95c-468d-b220-1fae07708825-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.292738 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e-kube-api-access-qkmgq" (OuterVolumeSpecName: "kube-api-access-qkmgq") pod "41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e" (UID: "41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e"). InnerVolumeSpecName "kube-api-access-qkmgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.295308 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e-utilities" (OuterVolumeSpecName: "utilities") pod "41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e" (UID: "41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.296497 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8c919bf-e04d-4c09-84fa-064f434383bb-kube-api-access-pthfb" (OuterVolumeSpecName: "kube-api-access-pthfb") pod "b8c919bf-e04d-4c09-84fa-064f434383bb" (UID: "b8c919bf-e04d-4c09-84fa-064f434383bb"). InnerVolumeSpecName "kube-api-access-pthfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.296891 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e13a5b1-f9f7-4045-952a-a44cfd536a99-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "4e13a5b1-f9f7-4045-952a-a44cfd536a99" (UID: "4e13a5b1-f9f7-4045-952a-a44cfd536a99"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.297600 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e13a5b1-f9f7-4045-952a-a44cfd536a99-kube-api-access-ffvcw" (OuterVolumeSpecName: "kube-api-access-ffvcw") pod "4e13a5b1-f9f7-4045-952a-a44cfd536a99" (UID: "4e13a5b1-f9f7-4045-952a-a44cfd536a99"). InnerVolumeSpecName "kube-api-access-ffvcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.299217 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c852825f-7fac-45d3-b801-ae6eb253989a-utilities" (OuterVolumeSpecName: "utilities") pod "c852825f-7fac-45d3-b801-ae6eb253989a" (UID: "c852825f-7fac-45d3-b801-ae6eb253989a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.299344 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e13a5b1-f9f7-4045-952a-a44cfd536a99-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "4e13a5b1-f9f7-4045-952a-a44cfd536a99" (UID: "4e13a5b1-f9f7-4045-952a-a44cfd536a99"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.299505 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c852825f-7fac-45d3-b801-ae6eb253989a-kube-api-access-rw8cf" (OuterVolumeSpecName: "kube-api-access-rw8cf") pod "c852825f-7fac-45d3-b801-ae6eb253989a" (UID: "c852825f-7fac-45d3-b801-ae6eb253989a"). InnerVolumeSpecName "kube-api-access-rw8cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.302036 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10d7e769-e95c-468d-b220-1fae07708825-kube-api-access-26hcg" (OuterVolumeSpecName: "kube-api-access-26hcg") pod "10d7e769-e95c-468d-b220-1fae07708825" (UID: "10d7e769-e95c-468d-b220-1fae07708825"). InnerVolumeSpecName "kube-api-access-26hcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.311203 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e" (UID: "41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.339988 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8c919bf-e04d-4c09-84fa-064f434383bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8c919bf-e04d-4c09-84fa-064f434383bb" (UID: "b8c919bf-e04d-4c09-84fa-064f434383bb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.349832 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c852825f-7fac-45d3-b801-ae6eb253989a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c852825f-7fac-45d3-b801-ae6eb253989a" (UID: "c852825f-7fac-45d3-b801-ae6eb253989a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.386491 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10d7e769-e95c-468d-b220-1fae07708825-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10d7e769-e95c-468d-b220-1fae07708825" (UID: "10d7e769-e95c-468d-b220-1fae07708825"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.393319 4563 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8c919bf-e04d-4c09-84fa-064f434383bb-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.393340 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw8cf\" (UniqueName: \"kubernetes.io/projected/c852825f-7fac-45d3-b801-ae6eb253989a-kube-api-access-rw8cf\") on node \"crc\" DevicePath \"\"" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.393351 4563 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4e13a5b1-f9f7-4045-952a-a44cfd536a99-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.393361 4563 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c852825f-7fac-45d3-b801-ae6eb253989a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.393371 4563 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.393379 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffvcw\" (UniqueName: \"kubernetes.io/projected/4e13a5b1-f9f7-4045-952a-a44cfd536a99-kube-api-access-ffvcw\") on node \"crc\" DevicePath \"\"" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.393390 4563 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e13a5b1-f9f7-4045-952a-a44cfd536a99-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 
24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.393397 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkmgq\" (UniqueName: \"kubernetes.io/projected/41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e-kube-api-access-qkmgq\") on node \"crc\" DevicePath \"\"" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.393405 4563 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.393413 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26hcg\" (UniqueName: \"kubernetes.io/projected/10d7e769-e95c-468d-b220-1fae07708825-kube-api-access-26hcg\") on node \"crc\" DevicePath \"\"" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.393420 4563 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10d7e769-e95c-468d-b220-1fae07708825-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.393427 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pthfb\" (UniqueName: \"kubernetes.io/projected/b8c919bf-e04d-4c09-84fa-064f434383bb-kube-api-access-pthfb\") on node \"crc\" DevicePath \"\"" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.393435 4563 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c852825f-7fac-45d3-b801-ae6eb253989a-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.543321 4563 generic.go:334] "Generic (PLEG): container finished" podID="c852825f-7fac-45d3-b801-ae6eb253989a" containerID="bd1b821eb6953f3bee8ea0a55411fe55393dd64decd840c91a41df3c982c6924" exitCode=0 Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.543383 4563 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-jhn4v" event={"ID":"c852825f-7fac-45d3-b801-ae6eb253989a","Type":"ContainerDied","Data":"bd1b821eb6953f3bee8ea0a55411fe55393dd64decd840c91a41df3c982c6924"} Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.543616 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jhn4v" event={"ID":"c852825f-7fac-45d3-b801-ae6eb253989a","Type":"ContainerDied","Data":"45f128b334beed5a49273d65c5bcd6ed1801f80f16074811720e183272613d13"} Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.543410 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jhn4v" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.543869 4563 scope.go:117] "RemoveContainer" containerID="bd1b821eb6953f3bee8ea0a55411fe55393dd64decd840c91a41df3c982c6924" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.546207 4563 generic.go:334] "Generic (PLEG): container finished" podID="41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e" containerID="2037f7894bd1ebe2835c26ea5f66c5ca7178fef508a345ba186f128d502f0fe5" exitCode=0 Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.546253 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wp6lh" event={"ID":"41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e","Type":"ContainerDied","Data":"2037f7894bd1ebe2835c26ea5f66c5ca7178fef508a345ba186f128d502f0fe5"} Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.546270 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wp6lh" event={"ID":"41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e","Type":"ContainerDied","Data":"a5ebc21ba3361324260a7420c40362257ef2ba5fe7f8935bca872321353877bd"} Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.546313 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wp6lh" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.548118 4563 generic.go:334] "Generic (PLEG): container finished" podID="b8c919bf-e04d-4c09-84fa-064f434383bb" containerID="1a7fcfc73514912064910e1c8c5ca88be3b96eaceca810b00db895b968d14343" exitCode=0 Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.548219 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fkz2h" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.548259 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fkz2h" event={"ID":"b8c919bf-e04d-4c09-84fa-064f434383bb","Type":"ContainerDied","Data":"1a7fcfc73514912064910e1c8c5ca88be3b96eaceca810b00db895b968d14343"} Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.548279 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fkz2h" event={"ID":"b8c919bf-e04d-4c09-84fa-064f434383bb","Type":"ContainerDied","Data":"aa0d08f4669337a7e1f3a6b8538bbe8bb26fc7fe1aa90d36bd878047f72ec099"} Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.550529 4563 generic.go:334] "Generic (PLEG): container finished" podID="10d7e769-e95c-468d-b220-1fae07708825" containerID="4c9e7f41601e4dc7bea0e711a68fb94187c68eeec6fb26ffb9b367eb60ea769e" exitCode=0 Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.550627 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kltmd" event={"ID":"10d7e769-e95c-468d-b220-1fae07708825","Type":"ContainerDied","Data":"4c9e7f41601e4dc7bea0e711a68fb94187c68eeec6fb26ffb9b367eb60ea769e"} Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.550676 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kltmd" 
event={"ID":"10d7e769-e95c-468d-b220-1fae07708825","Type":"ContainerDied","Data":"be3d98eca2437c649edab4c862648ab229da7d19e7d5681db7f7f8f27c7fd545"} Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.550837 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kltmd" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.551694 4563 generic.go:334] "Generic (PLEG): container finished" podID="4e13a5b1-f9f7-4045-952a-a44cfd536a99" containerID="2888f7358774a3fccbfb2cfda7ab8cd22bbb1457aa038676a408e6e31aaa14e3" exitCode=0 Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.551728 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zmp5c" event={"ID":"4e13a5b1-f9f7-4045-952a-a44cfd536a99","Type":"ContainerDied","Data":"2888f7358774a3fccbfb2cfda7ab8cd22bbb1457aa038676a408e6e31aaa14e3"} Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.551747 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zmp5c" event={"ID":"4e13a5b1-f9f7-4045-952a-a44cfd536a99","Type":"ContainerDied","Data":"1d83dd74979d0554ff7d7a41928c87f4cec5e72ae69b7286d0e7c94af6236873"} Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.551781 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zmp5c" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.558348 4563 scope.go:117] "RemoveContainer" containerID="5b43dc66373a40d2d90857babe4c6d78d306cd5267139346eda971a3652a0308" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.570796 4563 scope.go:117] "RemoveContainer" containerID="feb154e7eca53ac0dcf14cdc0f2bd695742fbe6124fafe2de04a6c1679f1f212" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.577731 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fkz2h"] Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.586124 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fkz2h"] Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.590603 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kltmd"] Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.592338 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kltmd"] Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.598945 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zmp5c"] Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.600393 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zmp5c"] Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.609624 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wp6lh"] Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.610847 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wp6lh"] Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.614370 4563 scope.go:117] "RemoveContainer" 
containerID="bd1b821eb6953f3bee8ea0a55411fe55393dd64decd840c91a41df3c982c6924" Nov 24 09:07:42 crc kubenswrapper[4563]: E1124 09:07:42.614913 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd1b821eb6953f3bee8ea0a55411fe55393dd64decd840c91a41df3c982c6924\": container with ID starting with bd1b821eb6953f3bee8ea0a55411fe55393dd64decd840c91a41df3c982c6924 not found: ID does not exist" containerID="bd1b821eb6953f3bee8ea0a55411fe55393dd64decd840c91a41df3c982c6924" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.615010 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd1b821eb6953f3bee8ea0a55411fe55393dd64decd840c91a41df3c982c6924"} err="failed to get container status \"bd1b821eb6953f3bee8ea0a55411fe55393dd64decd840c91a41df3c982c6924\": rpc error: code = NotFound desc = could not find container \"bd1b821eb6953f3bee8ea0a55411fe55393dd64decd840c91a41df3c982c6924\": container with ID starting with bd1b821eb6953f3bee8ea0a55411fe55393dd64decd840c91a41df3c982c6924 not found: ID does not exist" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.615093 4563 scope.go:117] "RemoveContainer" containerID="5b43dc66373a40d2d90857babe4c6d78d306cd5267139346eda971a3652a0308" Nov 24 09:07:42 crc kubenswrapper[4563]: E1124 09:07:42.615852 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b43dc66373a40d2d90857babe4c6d78d306cd5267139346eda971a3652a0308\": container with ID starting with 5b43dc66373a40d2d90857babe4c6d78d306cd5267139346eda971a3652a0308 not found: ID does not exist" containerID="5b43dc66373a40d2d90857babe4c6d78d306cd5267139346eda971a3652a0308" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.615934 4563 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5b43dc66373a40d2d90857babe4c6d78d306cd5267139346eda971a3652a0308"} err="failed to get container status \"5b43dc66373a40d2d90857babe4c6d78d306cd5267139346eda971a3652a0308\": rpc error: code = NotFound desc = could not find container \"5b43dc66373a40d2d90857babe4c6d78d306cd5267139346eda971a3652a0308\": container with ID starting with 5b43dc66373a40d2d90857babe4c6d78d306cd5267139346eda971a3652a0308 not found: ID does not exist" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.615998 4563 scope.go:117] "RemoveContainer" containerID="feb154e7eca53ac0dcf14cdc0f2bd695742fbe6124fafe2de04a6c1679f1f212" Nov 24 09:07:42 crc kubenswrapper[4563]: E1124 09:07:42.616300 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"feb154e7eca53ac0dcf14cdc0f2bd695742fbe6124fafe2de04a6c1679f1f212\": container with ID starting with feb154e7eca53ac0dcf14cdc0f2bd695742fbe6124fafe2de04a6c1679f1f212 not found: ID does not exist" containerID="feb154e7eca53ac0dcf14cdc0f2bd695742fbe6124fafe2de04a6c1679f1f212" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.616379 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"feb154e7eca53ac0dcf14cdc0f2bd695742fbe6124fafe2de04a6c1679f1f212"} err="failed to get container status \"feb154e7eca53ac0dcf14cdc0f2bd695742fbe6124fafe2de04a6c1679f1f212\": rpc error: code = NotFound desc = could not find container \"feb154e7eca53ac0dcf14cdc0f2bd695742fbe6124fafe2de04a6c1679f1f212\": container with ID starting with feb154e7eca53ac0dcf14cdc0f2bd695742fbe6124fafe2de04a6c1679f1f212 not found: ID does not exist" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.616447 4563 scope.go:117] "RemoveContainer" containerID="2037f7894bd1ebe2835c26ea5f66c5ca7178fef508a345ba186f128d502f0fe5" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.617208 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-jhn4v"] Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.622917 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jhn4v"] Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.624556 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5pxln"] Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.629577 4563 scope.go:117] "RemoveContainer" containerID="381ff15e3c62d403517b73d2204fa792d5aeb0e7ba0ce5d44e2a59e4e666b5bb" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.640874 4563 scope.go:117] "RemoveContainer" containerID="78c7685b989084a92e118cb38e1e0c74fc250f824d0a0f998a7b59173d947a82" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.652671 4563 scope.go:117] "RemoveContainer" containerID="2037f7894bd1ebe2835c26ea5f66c5ca7178fef508a345ba186f128d502f0fe5" Nov 24 09:07:42 crc kubenswrapper[4563]: E1124 09:07:42.653007 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2037f7894bd1ebe2835c26ea5f66c5ca7178fef508a345ba186f128d502f0fe5\": container with ID starting with 2037f7894bd1ebe2835c26ea5f66c5ca7178fef508a345ba186f128d502f0fe5 not found: ID does not exist" containerID="2037f7894bd1ebe2835c26ea5f66c5ca7178fef508a345ba186f128d502f0fe5" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.653041 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2037f7894bd1ebe2835c26ea5f66c5ca7178fef508a345ba186f128d502f0fe5"} err="failed to get container status \"2037f7894bd1ebe2835c26ea5f66c5ca7178fef508a345ba186f128d502f0fe5\": rpc error: code = NotFound desc = could not find container \"2037f7894bd1ebe2835c26ea5f66c5ca7178fef508a345ba186f128d502f0fe5\": container with ID starting with 2037f7894bd1ebe2835c26ea5f66c5ca7178fef508a345ba186f128d502f0fe5 not found: ID does 
not exist" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.653062 4563 scope.go:117] "RemoveContainer" containerID="381ff15e3c62d403517b73d2204fa792d5aeb0e7ba0ce5d44e2a59e4e666b5bb" Nov 24 09:07:42 crc kubenswrapper[4563]: E1124 09:07:42.653385 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"381ff15e3c62d403517b73d2204fa792d5aeb0e7ba0ce5d44e2a59e4e666b5bb\": container with ID starting with 381ff15e3c62d403517b73d2204fa792d5aeb0e7ba0ce5d44e2a59e4e666b5bb not found: ID does not exist" containerID="381ff15e3c62d403517b73d2204fa792d5aeb0e7ba0ce5d44e2a59e4e666b5bb" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.653478 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"381ff15e3c62d403517b73d2204fa792d5aeb0e7ba0ce5d44e2a59e4e666b5bb"} err="failed to get container status \"381ff15e3c62d403517b73d2204fa792d5aeb0e7ba0ce5d44e2a59e4e666b5bb\": rpc error: code = NotFound desc = could not find container \"381ff15e3c62d403517b73d2204fa792d5aeb0e7ba0ce5d44e2a59e4e666b5bb\": container with ID starting with 381ff15e3c62d403517b73d2204fa792d5aeb0e7ba0ce5d44e2a59e4e666b5bb not found: ID does not exist" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.653561 4563 scope.go:117] "RemoveContainer" containerID="78c7685b989084a92e118cb38e1e0c74fc250f824d0a0f998a7b59173d947a82" Nov 24 09:07:42 crc kubenswrapper[4563]: E1124 09:07:42.653957 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78c7685b989084a92e118cb38e1e0c74fc250f824d0a0f998a7b59173d947a82\": container with ID starting with 78c7685b989084a92e118cb38e1e0c74fc250f824d0a0f998a7b59173d947a82 not found: ID does not exist" containerID="78c7685b989084a92e118cb38e1e0c74fc250f824d0a0f998a7b59173d947a82" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.654048 4563 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78c7685b989084a92e118cb38e1e0c74fc250f824d0a0f998a7b59173d947a82"} err="failed to get container status \"78c7685b989084a92e118cb38e1e0c74fc250f824d0a0f998a7b59173d947a82\": rpc error: code = NotFound desc = could not find container \"78c7685b989084a92e118cb38e1e0c74fc250f824d0a0f998a7b59173d947a82\": container with ID starting with 78c7685b989084a92e118cb38e1e0c74fc250f824d0a0f998a7b59173d947a82 not found: ID does not exist" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.654138 4563 scope.go:117] "RemoveContainer" containerID="1a7fcfc73514912064910e1c8c5ca88be3b96eaceca810b00db895b968d14343" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.666026 4563 scope.go:117] "RemoveContainer" containerID="7f57ddaf491675a3c3ae0d2b3ff772f54141c8e9f97e29882570b63a73a651d6" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.687192 4563 scope.go:117] "RemoveContainer" containerID="1f4162972c8803809f39b9c4014d14149a06e623b12b89ee3fcc1c89357efbe7" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.707177 4563 scope.go:117] "RemoveContainer" containerID="1a7fcfc73514912064910e1c8c5ca88be3b96eaceca810b00db895b968d14343" Nov 24 09:07:42 crc kubenswrapper[4563]: E1124 09:07:42.707531 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a7fcfc73514912064910e1c8c5ca88be3b96eaceca810b00db895b968d14343\": container with ID starting with 1a7fcfc73514912064910e1c8c5ca88be3b96eaceca810b00db895b968d14343 not found: ID does not exist" containerID="1a7fcfc73514912064910e1c8c5ca88be3b96eaceca810b00db895b968d14343" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.707575 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a7fcfc73514912064910e1c8c5ca88be3b96eaceca810b00db895b968d14343"} err="failed to get container status 
\"1a7fcfc73514912064910e1c8c5ca88be3b96eaceca810b00db895b968d14343\": rpc error: code = NotFound desc = could not find container \"1a7fcfc73514912064910e1c8c5ca88be3b96eaceca810b00db895b968d14343\": container with ID starting with 1a7fcfc73514912064910e1c8c5ca88be3b96eaceca810b00db895b968d14343 not found: ID does not exist" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.707604 4563 scope.go:117] "RemoveContainer" containerID="7f57ddaf491675a3c3ae0d2b3ff772f54141c8e9f97e29882570b63a73a651d6" Nov 24 09:07:42 crc kubenswrapper[4563]: E1124 09:07:42.708259 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f57ddaf491675a3c3ae0d2b3ff772f54141c8e9f97e29882570b63a73a651d6\": container with ID starting with 7f57ddaf491675a3c3ae0d2b3ff772f54141c8e9f97e29882570b63a73a651d6 not found: ID does not exist" containerID="7f57ddaf491675a3c3ae0d2b3ff772f54141c8e9f97e29882570b63a73a651d6" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.708283 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f57ddaf491675a3c3ae0d2b3ff772f54141c8e9f97e29882570b63a73a651d6"} err="failed to get container status \"7f57ddaf491675a3c3ae0d2b3ff772f54141c8e9f97e29882570b63a73a651d6\": rpc error: code = NotFound desc = could not find container \"7f57ddaf491675a3c3ae0d2b3ff772f54141c8e9f97e29882570b63a73a651d6\": container with ID starting with 7f57ddaf491675a3c3ae0d2b3ff772f54141c8e9f97e29882570b63a73a651d6 not found: ID does not exist" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.708300 4563 scope.go:117] "RemoveContainer" containerID="1f4162972c8803809f39b9c4014d14149a06e623b12b89ee3fcc1c89357efbe7" Nov 24 09:07:42 crc kubenswrapper[4563]: E1124 09:07:42.708532 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1f4162972c8803809f39b9c4014d14149a06e623b12b89ee3fcc1c89357efbe7\": container with ID starting with 1f4162972c8803809f39b9c4014d14149a06e623b12b89ee3fcc1c89357efbe7 not found: ID does not exist" containerID="1f4162972c8803809f39b9c4014d14149a06e623b12b89ee3fcc1c89357efbe7" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.708553 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f4162972c8803809f39b9c4014d14149a06e623b12b89ee3fcc1c89357efbe7"} err="failed to get container status \"1f4162972c8803809f39b9c4014d14149a06e623b12b89ee3fcc1c89357efbe7\": rpc error: code = NotFound desc = could not find container \"1f4162972c8803809f39b9c4014d14149a06e623b12b89ee3fcc1c89357efbe7\": container with ID starting with 1f4162972c8803809f39b9c4014d14149a06e623b12b89ee3fcc1c89357efbe7 not found: ID does not exist" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.708565 4563 scope.go:117] "RemoveContainer" containerID="4c9e7f41601e4dc7bea0e711a68fb94187c68eeec6fb26ffb9b367eb60ea769e" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.721194 4563 scope.go:117] "RemoveContainer" containerID="aa308ea2e9a775af023fa218ee0aa86d391c75d709d10dba15393a1eff10c1f4" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.733666 4563 scope.go:117] "RemoveContainer" containerID="e5eb125a1f1333c2c959936f98c7fe61404bcd6a6d702e4bb0e7b8207b8b0ee6" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.746873 4563 scope.go:117] "RemoveContainer" containerID="4c9e7f41601e4dc7bea0e711a68fb94187c68eeec6fb26ffb9b367eb60ea769e" Nov 24 09:07:42 crc kubenswrapper[4563]: E1124 09:07:42.747366 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c9e7f41601e4dc7bea0e711a68fb94187c68eeec6fb26ffb9b367eb60ea769e\": container with ID starting with 4c9e7f41601e4dc7bea0e711a68fb94187c68eeec6fb26ffb9b367eb60ea769e not found: ID does not exist" 
containerID="4c9e7f41601e4dc7bea0e711a68fb94187c68eeec6fb26ffb9b367eb60ea769e" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.747399 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c9e7f41601e4dc7bea0e711a68fb94187c68eeec6fb26ffb9b367eb60ea769e"} err="failed to get container status \"4c9e7f41601e4dc7bea0e711a68fb94187c68eeec6fb26ffb9b367eb60ea769e\": rpc error: code = NotFound desc = could not find container \"4c9e7f41601e4dc7bea0e711a68fb94187c68eeec6fb26ffb9b367eb60ea769e\": container with ID starting with 4c9e7f41601e4dc7bea0e711a68fb94187c68eeec6fb26ffb9b367eb60ea769e not found: ID does not exist" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.747421 4563 scope.go:117] "RemoveContainer" containerID="aa308ea2e9a775af023fa218ee0aa86d391c75d709d10dba15393a1eff10c1f4" Nov 24 09:07:42 crc kubenswrapper[4563]: E1124 09:07:42.748339 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa308ea2e9a775af023fa218ee0aa86d391c75d709d10dba15393a1eff10c1f4\": container with ID starting with aa308ea2e9a775af023fa218ee0aa86d391c75d709d10dba15393a1eff10c1f4 not found: ID does not exist" containerID="aa308ea2e9a775af023fa218ee0aa86d391c75d709d10dba15393a1eff10c1f4" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.748361 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa308ea2e9a775af023fa218ee0aa86d391c75d709d10dba15393a1eff10c1f4"} err="failed to get container status \"aa308ea2e9a775af023fa218ee0aa86d391c75d709d10dba15393a1eff10c1f4\": rpc error: code = NotFound desc = could not find container \"aa308ea2e9a775af023fa218ee0aa86d391c75d709d10dba15393a1eff10c1f4\": container with ID starting with aa308ea2e9a775af023fa218ee0aa86d391c75d709d10dba15393a1eff10c1f4 not found: ID does not exist" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.748394 4563 scope.go:117] 
"RemoveContainer" containerID="e5eb125a1f1333c2c959936f98c7fe61404bcd6a6d702e4bb0e7b8207b8b0ee6" Nov 24 09:07:42 crc kubenswrapper[4563]: E1124 09:07:42.748733 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5eb125a1f1333c2c959936f98c7fe61404bcd6a6d702e4bb0e7b8207b8b0ee6\": container with ID starting with e5eb125a1f1333c2c959936f98c7fe61404bcd6a6d702e4bb0e7b8207b8b0ee6 not found: ID does not exist" containerID="e5eb125a1f1333c2c959936f98c7fe61404bcd6a6d702e4bb0e7b8207b8b0ee6" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.748761 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5eb125a1f1333c2c959936f98c7fe61404bcd6a6d702e4bb0e7b8207b8b0ee6"} err="failed to get container status \"e5eb125a1f1333c2c959936f98c7fe61404bcd6a6d702e4bb0e7b8207b8b0ee6\": rpc error: code = NotFound desc = could not find container \"e5eb125a1f1333c2c959936f98c7fe61404bcd6a6d702e4bb0e7b8207b8b0ee6\": container with ID starting with e5eb125a1f1333c2c959936f98c7fe61404bcd6a6d702e4bb0e7b8207b8b0ee6 not found: ID does not exist" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.748780 4563 scope.go:117] "RemoveContainer" containerID="2888f7358774a3fccbfb2cfda7ab8cd22bbb1457aa038676a408e6e31aaa14e3" Nov 24 09:07:42 crc kubenswrapper[4563]: I1124 09:07:42.761019 4563 scope.go:117] "RemoveContainer" containerID="2888f7358774a3fccbfb2cfda7ab8cd22bbb1457aa038676a408e6e31aaa14e3" Nov 24 09:07:42 crc kubenswrapper[4563]: E1124 09:07:42.761454 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2888f7358774a3fccbfb2cfda7ab8cd22bbb1457aa038676a408e6e31aaa14e3\": container with ID starting with 2888f7358774a3fccbfb2cfda7ab8cd22bbb1457aa038676a408e6e31aaa14e3 not found: ID does not exist" containerID="2888f7358774a3fccbfb2cfda7ab8cd22bbb1457aa038676a408e6e31aaa14e3" Nov 24 09:07:42 crc 
kubenswrapper[4563]: I1124 09:07:42.761497 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2888f7358774a3fccbfb2cfda7ab8cd22bbb1457aa038676a408e6e31aaa14e3"} err="failed to get container status \"2888f7358774a3fccbfb2cfda7ab8cd22bbb1457aa038676a408e6e31aaa14e3\": rpc error: code = NotFound desc = could not find container \"2888f7358774a3fccbfb2cfda7ab8cd22bbb1457aa038676a408e6e31aaa14e3\": container with ID starting with 2888f7358774a3fccbfb2cfda7ab8cd22bbb1457aa038676a408e6e31aaa14e3 not found: ID does not exist" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.062432 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10d7e769-e95c-468d-b220-1fae07708825" path="/var/lib/kubelet/pods/10d7e769-e95c-468d-b220-1fae07708825/volumes" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.063338 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e" path="/var/lib/kubelet/pods/41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e/volumes" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.064983 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e13a5b1-f9f7-4045-952a-a44cfd536a99" path="/var/lib/kubelet/pods/4e13a5b1-f9f7-4045-952a-a44cfd536a99/volumes" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.065490 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8c919bf-e04d-4c09-84fa-064f434383bb" path="/var/lib/kubelet/pods/b8c919bf-e04d-4c09-84fa-064f434383bb/volumes" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.066131 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c852825f-7fac-45d3-b801-ae6eb253989a" path="/var/lib/kubelet/pods/c852825f-7fac-45d3-b801-ae6eb253989a/volumes" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.558899 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-5pxln" event={"ID":"157ed1a3-ea31-4a6b-8e91-2852d4c50600","Type":"ContainerStarted","Data":"ee62fdd7bbb758721969cbf0ae1d7082a71a1f6324ed1085bba9c49e103e2240"} Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.558945 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5pxln" event={"ID":"157ed1a3-ea31-4a6b-8e91-2852d4c50600","Type":"ContainerStarted","Data":"393363fc0dd522450e0179d2e73c26b9d01acf3af6d72ca3b5961562dc193e35"} Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.559913 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-5pxln" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.563680 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-5pxln" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.577468 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-5pxln" podStartSLOduration=2.577443426 podStartE2EDuration="2.577443426s" podCreationTimestamp="2025-11-24 09:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:07:43.57444978 +0000 UTC m=+240.833427226" watchObservedRunningTime="2025-11-24 09:07:43.577443426 +0000 UTC m=+240.836420862" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.799548 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nsq9v"] Nov 24 09:07:43 crc kubenswrapper[4563]: E1124 09:07:43.800301 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8c919bf-e04d-4c09-84fa-064f434383bb" containerName="registry-server" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.800330 4563 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="b8c919bf-e04d-4c09-84fa-064f434383bb" containerName="registry-server" Nov 24 09:07:43 crc kubenswrapper[4563]: E1124 09:07:43.800347 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e" containerName="extract-content" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.800354 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e" containerName="extract-content" Nov 24 09:07:43 crc kubenswrapper[4563]: E1124 09:07:43.800364 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10d7e769-e95c-468d-b220-1fae07708825" containerName="extract-content" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.800372 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="10d7e769-e95c-468d-b220-1fae07708825" containerName="extract-content" Nov 24 09:07:43 crc kubenswrapper[4563]: E1124 09:07:43.800385 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e" containerName="registry-server" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.800392 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e" containerName="registry-server" Nov 24 09:07:43 crc kubenswrapper[4563]: E1124 09:07:43.800403 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8c919bf-e04d-4c09-84fa-064f434383bb" containerName="extract-content" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.800410 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8c919bf-e04d-4c09-84fa-064f434383bb" containerName="extract-content" Nov 24 09:07:43 crc kubenswrapper[4563]: E1124 09:07:43.800418 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10d7e769-e95c-468d-b220-1fae07708825" containerName="registry-server" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.800424 4563 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="10d7e769-e95c-468d-b220-1fae07708825" containerName="registry-server" Nov 24 09:07:43 crc kubenswrapper[4563]: E1124 09:07:43.800433 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e13a5b1-f9f7-4045-952a-a44cfd536a99" containerName="marketplace-operator" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.800439 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e13a5b1-f9f7-4045-952a-a44cfd536a99" containerName="marketplace-operator" Nov 24 09:07:43 crc kubenswrapper[4563]: E1124 09:07:43.800454 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c852825f-7fac-45d3-b801-ae6eb253989a" containerName="registry-server" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.800460 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="c852825f-7fac-45d3-b801-ae6eb253989a" containerName="registry-server" Nov 24 09:07:43 crc kubenswrapper[4563]: E1124 09:07:43.800467 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10d7e769-e95c-468d-b220-1fae07708825" containerName="extract-utilities" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.800475 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="10d7e769-e95c-468d-b220-1fae07708825" containerName="extract-utilities" Nov 24 09:07:43 crc kubenswrapper[4563]: E1124 09:07:43.800483 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c852825f-7fac-45d3-b801-ae6eb253989a" containerName="extract-utilities" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.800490 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="c852825f-7fac-45d3-b801-ae6eb253989a" containerName="extract-utilities" Nov 24 09:07:43 crc kubenswrapper[4563]: E1124 09:07:43.800497 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8c919bf-e04d-4c09-84fa-064f434383bb" containerName="extract-utilities" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.800506 4563 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="b8c919bf-e04d-4c09-84fa-064f434383bb" containerName="extract-utilities" Nov 24 09:07:43 crc kubenswrapper[4563]: E1124 09:07:43.800514 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c852825f-7fac-45d3-b801-ae6eb253989a" containerName="extract-content" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.800520 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="c852825f-7fac-45d3-b801-ae6eb253989a" containerName="extract-content" Nov 24 09:07:43 crc kubenswrapper[4563]: E1124 09:07:43.800533 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e" containerName="extract-utilities" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.800539 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e" containerName="extract-utilities" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.800680 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="10d7e769-e95c-468d-b220-1fae07708825" containerName="registry-server" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.800701 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e13a5b1-f9f7-4045-952a-a44cfd536a99" containerName="marketplace-operator" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.800712 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="c852825f-7fac-45d3-b801-ae6eb253989a" containerName="registry-server" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.800724 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8c919bf-e04d-4c09-84fa-064f434383bb" containerName="registry-server" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.800731 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="41ab3d1b-15bc-4d6c-b4b5-b73e11bef79e" containerName="registry-server" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.801746 4563 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nsq9v" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.807012 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.810548 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtdsc\" (UniqueName: \"kubernetes.io/projected/cb0770b7-f971-4e96-ab32-1f41b4cd9885-kube-api-access-rtdsc\") pod \"redhat-marketplace-nsq9v\" (UID: \"cb0770b7-f971-4e96-ab32-1f41b4cd9885\") " pod="openshift-marketplace/redhat-marketplace-nsq9v" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.810593 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb0770b7-f971-4e96-ab32-1f41b4cd9885-utilities\") pod \"redhat-marketplace-nsq9v\" (UID: \"cb0770b7-f971-4e96-ab32-1f41b4cd9885\") " pod="openshift-marketplace/redhat-marketplace-nsq9v" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.810722 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb0770b7-f971-4e96-ab32-1f41b4cd9885-catalog-content\") pod \"redhat-marketplace-nsq9v\" (UID: \"cb0770b7-f971-4e96-ab32-1f41b4cd9885\") " pod="openshift-marketplace/redhat-marketplace-nsq9v" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.810800 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nsq9v"] Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.912261 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb0770b7-f971-4e96-ab32-1f41b4cd9885-utilities\") pod 
\"redhat-marketplace-nsq9v\" (UID: \"cb0770b7-f971-4e96-ab32-1f41b4cd9885\") " pod="openshift-marketplace/redhat-marketplace-nsq9v" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.912414 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb0770b7-f971-4e96-ab32-1f41b4cd9885-catalog-content\") pod \"redhat-marketplace-nsq9v\" (UID: \"cb0770b7-f971-4e96-ab32-1f41b4cd9885\") " pod="openshift-marketplace/redhat-marketplace-nsq9v" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.912536 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtdsc\" (UniqueName: \"kubernetes.io/projected/cb0770b7-f971-4e96-ab32-1f41b4cd9885-kube-api-access-rtdsc\") pod \"redhat-marketplace-nsq9v\" (UID: \"cb0770b7-f971-4e96-ab32-1f41b4cd9885\") " pod="openshift-marketplace/redhat-marketplace-nsq9v" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.912736 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb0770b7-f971-4e96-ab32-1f41b4cd9885-utilities\") pod \"redhat-marketplace-nsq9v\" (UID: \"cb0770b7-f971-4e96-ab32-1f41b4cd9885\") " pod="openshift-marketplace/redhat-marketplace-nsq9v" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.912773 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb0770b7-f971-4e96-ab32-1f41b4cd9885-catalog-content\") pod \"redhat-marketplace-nsq9v\" (UID: \"cb0770b7-f971-4e96-ab32-1f41b4cd9885\") " pod="openshift-marketplace/redhat-marketplace-nsq9v" Nov 24 09:07:43 crc kubenswrapper[4563]: I1124 09:07:43.928196 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtdsc\" (UniqueName: \"kubernetes.io/projected/cb0770b7-f971-4e96-ab32-1f41b4cd9885-kube-api-access-rtdsc\") pod \"redhat-marketplace-nsq9v\" (UID: 
\"cb0770b7-f971-4e96-ab32-1f41b4cd9885\") " pod="openshift-marketplace/redhat-marketplace-nsq9v" Nov 24 09:07:44 crc kubenswrapper[4563]: I1124 09:07:44.116251 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nsq9v" Nov 24 09:07:44 crc kubenswrapper[4563]: I1124 09:07:44.397232 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r6r55"] Nov 24 09:07:44 crc kubenswrapper[4563]: I1124 09:07:44.398452 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r6r55" Nov 24 09:07:44 crc kubenswrapper[4563]: I1124 09:07:44.400364 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 24 09:07:44 crc kubenswrapper[4563]: I1124 09:07:44.409444 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r6r55"] Nov 24 09:07:44 crc kubenswrapper[4563]: I1124 09:07:44.416864 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2704a13f-1433-4804-8818-e433c50beff1-catalog-content\") pod \"redhat-operators-r6r55\" (UID: \"2704a13f-1433-4804-8818-e433c50beff1\") " pod="openshift-marketplace/redhat-operators-r6r55" Nov 24 09:07:44 crc kubenswrapper[4563]: I1124 09:07:44.416894 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgpr8\" (UniqueName: \"kubernetes.io/projected/2704a13f-1433-4804-8818-e433c50beff1-kube-api-access-pgpr8\") pod \"redhat-operators-r6r55\" (UID: \"2704a13f-1433-4804-8818-e433c50beff1\") " pod="openshift-marketplace/redhat-operators-r6r55" Nov 24 09:07:44 crc kubenswrapper[4563]: I1124 09:07:44.416917 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2704a13f-1433-4804-8818-e433c50beff1-utilities\") pod \"redhat-operators-r6r55\" (UID: \"2704a13f-1433-4804-8818-e433c50beff1\") " pod="openshift-marketplace/redhat-operators-r6r55" Nov 24 09:07:44 crc kubenswrapper[4563]: I1124 09:07:44.444738 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nsq9v"] Nov 24 09:07:44 crc kubenswrapper[4563]: W1124 09:07:44.449138 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb0770b7_f971_4e96_ab32_1f41b4cd9885.slice/crio-1e0f9e23224b978f1e8b4542fee0508b204b7e2adb49852f21876c51c849651d WatchSource:0}: Error finding container 1e0f9e23224b978f1e8b4542fee0508b204b7e2adb49852f21876c51c849651d: Status 404 returned error can't find the container with id 1e0f9e23224b978f1e8b4542fee0508b204b7e2adb49852f21876c51c849651d Nov 24 09:07:44 crc kubenswrapper[4563]: I1124 09:07:44.517547 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2704a13f-1433-4804-8818-e433c50beff1-catalog-content\") pod \"redhat-operators-r6r55\" (UID: \"2704a13f-1433-4804-8818-e433c50beff1\") " pod="openshift-marketplace/redhat-operators-r6r55" Nov 24 09:07:44 crc kubenswrapper[4563]: I1124 09:07:44.517580 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgpr8\" (UniqueName: \"kubernetes.io/projected/2704a13f-1433-4804-8818-e433c50beff1-kube-api-access-pgpr8\") pod \"redhat-operators-r6r55\" (UID: \"2704a13f-1433-4804-8818-e433c50beff1\") " pod="openshift-marketplace/redhat-operators-r6r55" Nov 24 09:07:44 crc kubenswrapper[4563]: I1124 09:07:44.517603 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2704a13f-1433-4804-8818-e433c50beff1-utilities\") pod 
\"redhat-operators-r6r55\" (UID: \"2704a13f-1433-4804-8818-e433c50beff1\") " pod="openshift-marketplace/redhat-operators-r6r55" Nov 24 09:07:44 crc kubenswrapper[4563]: I1124 09:07:44.517938 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2704a13f-1433-4804-8818-e433c50beff1-catalog-content\") pod \"redhat-operators-r6r55\" (UID: \"2704a13f-1433-4804-8818-e433c50beff1\") " pod="openshift-marketplace/redhat-operators-r6r55" Nov 24 09:07:44 crc kubenswrapper[4563]: I1124 09:07:44.517981 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2704a13f-1433-4804-8818-e433c50beff1-utilities\") pod \"redhat-operators-r6r55\" (UID: \"2704a13f-1433-4804-8818-e433c50beff1\") " pod="openshift-marketplace/redhat-operators-r6r55" Nov 24 09:07:44 crc kubenswrapper[4563]: I1124 09:07:44.533795 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgpr8\" (UniqueName: \"kubernetes.io/projected/2704a13f-1433-4804-8818-e433c50beff1-kube-api-access-pgpr8\") pod \"redhat-operators-r6r55\" (UID: \"2704a13f-1433-4804-8818-e433c50beff1\") " pod="openshift-marketplace/redhat-operators-r6r55" Nov 24 09:07:44 crc kubenswrapper[4563]: I1124 09:07:44.563914 4563 generic.go:334] "Generic (PLEG): container finished" podID="cb0770b7-f971-4e96-ab32-1f41b4cd9885" containerID="4663418a437b95bedd144aeb9fadd6e22a9677ce84ee9dba36bc8adc31920e92" exitCode=0 Nov 24 09:07:44 crc kubenswrapper[4563]: I1124 09:07:44.563985 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nsq9v" event={"ID":"cb0770b7-f971-4e96-ab32-1f41b4cd9885","Type":"ContainerDied","Data":"4663418a437b95bedd144aeb9fadd6e22a9677ce84ee9dba36bc8adc31920e92"} Nov 24 09:07:44 crc kubenswrapper[4563]: I1124 09:07:44.564023 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-nsq9v" event={"ID":"cb0770b7-f971-4e96-ab32-1f41b4cd9885","Type":"ContainerStarted","Data":"1e0f9e23224b978f1e8b4542fee0508b204b7e2adb49852f21876c51c849651d"} Nov 24 09:07:44 crc kubenswrapper[4563]: I1124 09:07:44.712707 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r6r55" Nov 24 09:07:45 crc kubenswrapper[4563]: I1124 09:07:45.040571 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r6r55"] Nov 24 09:07:45 crc kubenswrapper[4563]: I1124 09:07:45.569384 4563 generic.go:334] "Generic (PLEG): container finished" podID="cb0770b7-f971-4e96-ab32-1f41b4cd9885" containerID="8d31d49c69d3b84786d861863556107e2f26f273daa76eca98f1849d9933e283" exitCode=0 Nov 24 09:07:45 crc kubenswrapper[4563]: I1124 09:07:45.569436 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nsq9v" event={"ID":"cb0770b7-f971-4e96-ab32-1f41b4cd9885","Type":"ContainerDied","Data":"8d31d49c69d3b84786d861863556107e2f26f273daa76eca98f1849d9933e283"} Nov 24 09:07:45 crc kubenswrapper[4563]: I1124 09:07:45.572738 4563 generic.go:334] "Generic (PLEG): container finished" podID="2704a13f-1433-4804-8818-e433c50beff1" containerID="220ae75c298fb1ac368ca98f4b5b5c18864c721087fb935d1b679336694d8be5" exitCode=0 Nov 24 09:07:45 crc kubenswrapper[4563]: I1124 09:07:45.572998 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6r55" event={"ID":"2704a13f-1433-4804-8818-e433c50beff1","Type":"ContainerDied","Data":"220ae75c298fb1ac368ca98f4b5b5c18864c721087fb935d1b679336694d8be5"} Nov 24 09:07:45 crc kubenswrapper[4563]: I1124 09:07:45.573042 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6r55" 
event={"ID":"2704a13f-1433-4804-8818-e433c50beff1","Type":"ContainerStarted","Data":"8ddc29984564c821f8ff87fc4f5f775f8ca363365a56521da52a3dc0faf9db87"} Nov 24 09:07:46 crc kubenswrapper[4563]: I1124 09:07:46.197240 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sf9ml"] Nov 24 09:07:46 crc kubenswrapper[4563]: I1124 09:07:46.198519 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sf9ml" Nov 24 09:07:46 crc kubenswrapper[4563]: I1124 09:07:46.200054 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 24 09:07:46 crc kubenswrapper[4563]: I1124 09:07:46.203785 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sf9ml"] Nov 24 09:07:46 crc kubenswrapper[4563]: I1124 09:07:46.236186 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xk4t\" (UniqueName: \"kubernetes.io/projected/83e0ce45-d845-49ab-b393-af085b920737-kube-api-access-5xk4t\") pod \"certified-operators-sf9ml\" (UID: \"83e0ce45-d845-49ab-b393-af085b920737\") " pod="openshift-marketplace/certified-operators-sf9ml" Nov 24 09:07:46 crc kubenswrapper[4563]: I1124 09:07:46.236238 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83e0ce45-d845-49ab-b393-af085b920737-catalog-content\") pod \"certified-operators-sf9ml\" (UID: \"83e0ce45-d845-49ab-b393-af085b920737\") " pod="openshift-marketplace/certified-operators-sf9ml" Nov 24 09:07:46 crc kubenswrapper[4563]: I1124 09:07:46.236341 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83e0ce45-d845-49ab-b393-af085b920737-utilities\") pod 
\"certified-operators-sf9ml\" (UID: \"83e0ce45-d845-49ab-b393-af085b920737\") " pod="openshift-marketplace/certified-operators-sf9ml" Nov 24 09:07:46 crc kubenswrapper[4563]: I1124 09:07:46.337319 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83e0ce45-d845-49ab-b393-af085b920737-utilities\") pod \"certified-operators-sf9ml\" (UID: \"83e0ce45-d845-49ab-b393-af085b920737\") " pod="openshift-marketplace/certified-operators-sf9ml" Nov 24 09:07:46 crc kubenswrapper[4563]: I1124 09:07:46.337475 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xk4t\" (UniqueName: \"kubernetes.io/projected/83e0ce45-d845-49ab-b393-af085b920737-kube-api-access-5xk4t\") pod \"certified-operators-sf9ml\" (UID: \"83e0ce45-d845-49ab-b393-af085b920737\") " pod="openshift-marketplace/certified-operators-sf9ml" Nov 24 09:07:46 crc kubenswrapper[4563]: I1124 09:07:46.337566 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83e0ce45-d845-49ab-b393-af085b920737-catalog-content\") pod \"certified-operators-sf9ml\" (UID: \"83e0ce45-d845-49ab-b393-af085b920737\") " pod="openshift-marketplace/certified-operators-sf9ml" Nov 24 09:07:46 crc kubenswrapper[4563]: I1124 09:07:46.337941 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83e0ce45-d845-49ab-b393-af085b920737-utilities\") pod \"certified-operators-sf9ml\" (UID: \"83e0ce45-d845-49ab-b393-af085b920737\") " pod="openshift-marketplace/certified-operators-sf9ml" Nov 24 09:07:46 crc kubenswrapper[4563]: I1124 09:07:46.338039 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83e0ce45-d845-49ab-b393-af085b920737-catalog-content\") pod \"certified-operators-sf9ml\" (UID: 
\"83e0ce45-d845-49ab-b393-af085b920737\") " pod="openshift-marketplace/certified-operators-sf9ml" Nov 24 09:07:46 crc kubenswrapper[4563]: I1124 09:07:46.353302 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xk4t\" (UniqueName: \"kubernetes.io/projected/83e0ce45-d845-49ab-b393-af085b920737-kube-api-access-5xk4t\") pod \"certified-operators-sf9ml\" (UID: \"83e0ce45-d845-49ab-b393-af085b920737\") " pod="openshift-marketplace/certified-operators-sf9ml" Nov 24 09:07:46 crc kubenswrapper[4563]: I1124 09:07:46.524713 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sf9ml" Nov 24 09:07:46 crc kubenswrapper[4563]: I1124 09:07:46.581731 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6r55" event={"ID":"2704a13f-1433-4804-8818-e433c50beff1","Type":"ContainerStarted","Data":"5de5020a45d59de2be95c0099ba39e7e4646bfc7d66cbcf5a35f369521aa9761"} Nov 24 09:07:46 crc kubenswrapper[4563]: I1124 09:07:46.584823 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nsq9v" event={"ID":"cb0770b7-f971-4e96-ab32-1f41b4cd9885","Type":"ContainerStarted","Data":"77080a5bb45fd3e4fd8eaaf9f3ece1c4774cb5b41860a3e26eb285fd9c17e4d8"} Nov 24 09:07:46 crc kubenswrapper[4563]: I1124 09:07:46.613147 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nsq9v" podStartSLOduration=2.026460702 podStartE2EDuration="3.613130919s" podCreationTimestamp="2025-11-24 09:07:43 +0000 UTC" firstStartedPulling="2025-11-24 09:07:44.565052767 +0000 UTC m=+241.824030215" lastFinishedPulling="2025-11-24 09:07:46.151722985 +0000 UTC m=+243.410700432" observedRunningTime="2025-11-24 09:07:46.611160482 +0000 UTC m=+243.870137928" watchObservedRunningTime="2025-11-24 09:07:46.613130919 +0000 UTC m=+243.872108365" Nov 24 09:07:46 crc 
kubenswrapper[4563]: I1124 09:07:46.798385 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6cgfm"] Nov 24 09:07:46 crc kubenswrapper[4563]: I1124 09:07:46.799408 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6cgfm" Nov 24 09:07:46 crc kubenswrapper[4563]: I1124 09:07:46.800681 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 24 09:07:46 crc kubenswrapper[4563]: I1124 09:07:46.806412 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6cgfm"] Nov 24 09:07:46 crc kubenswrapper[4563]: I1124 09:07:46.841742 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48f738ab-b1e0-4e09-baf3-3ed15d54151c-catalog-content\") pod \"community-operators-6cgfm\" (UID: \"48f738ab-b1e0-4e09-baf3-3ed15d54151c\") " pod="openshift-marketplace/community-operators-6cgfm" Nov 24 09:07:46 crc kubenswrapper[4563]: I1124 09:07:46.841811 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48f738ab-b1e0-4e09-baf3-3ed15d54151c-utilities\") pod \"community-operators-6cgfm\" (UID: \"48f738ab-b1e0-4e09-baf3-3ed15d54151c\") " pod="openshift-marketplace/community-operators-6cgfm" Nov 24 09:07:46 crc kubenswrapper[4563]: I1124 09:07:46.841841 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8b79\" (UniqueName: \"kubernetes.io/projected/48f738ab-b1e0-4e09-baf3-3ed15d54151c-kube-api-access-w8b79\") pod \"community-operators-6cgfm\" (UID: \"48f738ab-b1e0-4e09-baf3-3ed15d54151c\") " pod="openshift-marketplace/community-operators-6cgfm" Nov 24 09:07:46 crc kubenswrapper[4563]: 
I1124 09:07:46.867483 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sf9ml"] Nov 24 09:07:46 crc kubenswrapper[4563]: W1124 09:07:46.904994 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83e0ce45_d845_49ab_b393_af085b920737.slice/crio-fe40417251def13837b971593a80bd4a99ed6e84e33b9f75bdb0b659537e037e WatchSource:0}: Error finding container fe40417251def13837b971593a80bd4a99ed6e84e33b9f75bdb0b659537e037e: Status 404 returned error can't find the container with id fe40417251def13837b971593a80bd4a99ed6e84e33b9f75bdb0b659537e037e Nov 24 09:07:46 crc kubenswrapper[4563]: I1124 09:07:46.942670 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48f738ab-b1e0-4e09-baf3-3ed15d54151c-catalog-content\") pod \"community-operators-6cgfm\" (UID: \"48f738ab-b1e0-4e09-baf3-3ed15d54151c\") " pod="openshift-marketplace/community-operators-6cgfm" Nov 24 09:07:46 crc kubenswrapper[4563]: I1124 09:07:46.942753 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48f738ab-b1e0-4e09-baf3-3ed15d54151c-utilities\") pod \"community-operators-6cgfm\" (UID: \"48f738ab-b1e0-4e09-baf3-3ed15d54151c\") " pod="openshift-marketplace/community-operators-6cgfm" Nov 24 09:07:46 crc kubenswrapper[4563]: I1124 09:07:46.942785 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8b79\" (UniqueName: \"kubernetes.io/projected/48f738ab-b1e0-4e09-baf3-3ed15d54151c-kube-api-access-w8b79\") pod \"community-operators-6cgfm\" (UID: \"48f738ab-b1e0-4e09-baf3-3ed15d54151c\") " pod="openshift-marketplace/community-operators-6cgfm" Nov 24 09:07:46 crc kubenswrapper[4563]: I1124 09:07:46.943054 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48f738ab-b1e0-4e09-baf3-3ed15d54151c-catalog-content\") pod \"community-operators-6cgfm\" (UID: \"48f738ab-b1e0-4e09-baf3-3ed15d54151c\") " pod="openshift-marketplace/community-operators-6cgfm" Nov 24 09:07:46 crc kubenswrapper[4563]: I1124 09:07:46.943159 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48f738ab-b1e0-4e09-baf3-3ed15d54151c-utilities\") pod \"community-operators-6cgfm\" (UID: \"48f738ab-b1e0-4e09-baf3-3ed15d54151c\") " pod="openshift-marketplace/community-operators-6cgfm" Nov 24 09:07:46 crc kubenswrapper[4563]: I1124 09:07:46.960348 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8b79\" (UniqueName: \"kubernetes.io/projected/48f738ab-b1e0-4e09-baf3-3ed15d54151c-kube-api-access-w8b79\") pod \"community-operators-6cgfm\" (UID: \"48f738ab-b1e0-4e09-baf3-3ed15d54151c\") " pod="openshift-marketplace/community-operators-6cgfm" Nov 24 09:07:47 crc kubenswrapper[4563]: I1124 09:07:47.126200 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6cgfm" Nov 24 09:07:47 crc kubenswrapper[4563]: I1124 09:07:47.458128 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6cgfm"] Nov 24 09:07:47 crc kubenswrapper[4563]: W1124 09:07:47.467011 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48f738ab_b1e0_4e09_baf3_3ed15d54151c.slice/crio-3b4595c8ef789a0893a812ea8f5103a8019370dee76c7cb91abf36240e3f3b30 WatchSource:0}: Error finding container 3b4595c8ef789a0893a812ea8f5103a8019370dee76c7cb91abf36240e3f3b30: Status 404 returned error can't find the container with id 3b4595c8ef789a0893a812ea8f5103a8019370dee76c7cb91abf36240e3f3b30 Nov 24 09:07:47 crc kubenswrapper[4563]: I1124 09:07:47.590146 4563 generic.go:334] "Generic (PLEG): container finished" podID="83e0ce45-d845-49ab-b393-af085b920737" containerID="60ef3808f2526069ab50376dd58037b2103401490aa27dcf88406fc64dc98ac4" exitCode=0 Nov 24 09:07:47 crc kubenswrapper[4563]: I1124 09:07:47.590340 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sf9ml" event={"ID":"83e0ce45-d845-49ab-b393-af085b920737","Type":"ContainerDied","Data":"60ef3808f2526069ab50376dd58037b2103401490aa27dcf88406fc64dc98ac4"} Nov 24 09:07:47 crc kubenswrapper[4563]: I1124 09:07:47.590375 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sf9ml" event={"ID":"83e0ce45-d845-49ab-b393-af085b920737","Type":"ContainerStarted","Data":"fe40417251def13837b971593a80bd4a99ed6e84e33b9f75bdb0b659537e037e"} Nov 24 09:07:47 crc kubenswrapper[4563]: I1124 09:07:47.592517 4563 generic.go:334] "Generic (PLEG): container finished" podID="48f738ab-b1e0-4e09-baf3-3ed15d54151c" containerID="cfbb8d29e7db0ec07b0ce034dd040e9781a11199280e83d38a03a4bdecb26128" exitCode=0 Nov 24 09:07:47 crc kubenswrapper[4563]: I1124 
09:07:47.592594 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cgfm" event={"ID":"48f738ab-b1e0-4e09-baf3-3ed15d54151c","Type":"ContainerDied","Data":"cfbb8d29e7db0ec07b0ce034dd040e9781a11199280e83d38a03a4bdecb26128"} Nov 24 09:07:47 crc kubenswrapper[4563]: I1124 09:07:47.592631 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cgfm" event={"ID":"48f738ab-b1e0-4e09-baf3-3ed15d54151c","Type":"ContainerStarted","Data":"3b4595c8ef789a0893a812ea8f5103a8019370dee76c7cb91abf36240e3f3b30"} Nov 24 09:07:47 crc kubenswrapper[4563]: I1124 09:07:47.595020 4563 generic.go:334] "Generic (PLEG): container finished" podID="2704a13f-1433-4804-8818-e433c50beff1" containerID="5de5020a45d59de2be95c0099ba39e7e4646bfc7d66cbcf5a35f369521aa9761" exitCode=0 Nov 24 09:07:47 crc kubenswrapper[4563]: I1124 09:07:47.595120 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6r55" event={"ID":"2704a13f-1433-4804-8818-e433c50beff1","Type":"ContainerDied","Data":"5de5020a45d59de2be95c0099ba39e7e4646bfc7d66cbcf5a35f369521aa9761"} Nov 24 09:07:49 crc kubenswrapper[4563]: I1124 09:07:49.606352 4563 generic.go:334] "Generic (PLEG): container finished" podID="48f738ab-b1e0-4e09-baf3-3ed15d54151c" containerID="88f2830b622e4cb60e2c23fd92d2b8ad9bfa7d1bc6bdf02cf1ea299095d04720" exitCode=0 Nov 24 09:07:49 crc kubenswrapper[4563]: I1124 09:07:49.606440 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cgfm" event={"ID":"48f738ab-b1e0-4e09-baf3-3ed15d54151c","Type":"ContainerDied","Data":"88f2830b622e4cb60e2c23fd92d2b8ad9bfa7d1bc6bdf02cf1ea299095d04720"} Nov 24 09:07:49 crc kubenswrapper[4563]: I1124 09:07:49.615478 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6r55" 
event={"ID":"2704a13f-1433-4804-8818-e433c50beff1","Type":"ContainerStarted","Data":"92d72b407fd77954e41bd4d805f09771f3a11466e8cc0d004866b72dfb651b10"} Nov 24 09:07:49 crc kubenswrapper[4563]: I1124 09:07:49.617782 4563 generic.go:334] "Generic (PLEG): container finished" podID="83e0ce45-d845-49ab-b393-af085b920737" containerID="7db47449cf3a599bc8bb882cb893487c4a044745a74aea6fbbc6321888d2dceb" exitCode=0 Nov 24 09:07:49 crc kubenswrapper[4563]: I1124 09:07:49.617830 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sf9ml" event={"ID":"83e0ce45-d845-49ab-b393-af085b920737","Type":"ContainerDied","Data":"7db47449cf3a599bc8bb882cb893487c4a044745a74aea6fbbc6321888d2dceb"} Nov 24 09:07:49 crc kubenswrapper[4563]: I1124 09:07:49.658602 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r6r55" podStartSLOduration=3.164904056 podStartE2EDuration="5.658583368s" podCreationTimestamp="2025-11-24 09:07:44 +0000 UTC" firstStartedPulling="2025-11-24 09:07:45.574102206 +0000 UTC m=+242.833079653" lastFinishedPulling="2025-11-24 09:07:48.067781519 +0000 UTC m=+245.326758965" observedRunningTime="2025-11-24 09:07:49.653030475 +0000 UTC m=+246.912007932" watchObservedRunningTime="2025-11-24 09:07:49.658583368 +0000 UTC m=+246.917560815" Nov 24 09:07:50 crc kubenswrapper[4563]: I1124 09:07:50.623783 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6cgfm" event={"ID":"48f738ab-b1e0-4e09-baf3-3ed15d54151c","Type":"ContainerStarted","Data":"fcd2c987f50a50ba6ce6c9353e77607af0d5efec5eda7b2fc81ff0760a379f54"} Nov 24 09:07:50 crc kubenswrapper[4563]: I1124 09:07:50.625450 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sf9ml" 
event={"ID":"83e0ce45-d845-49ab-b393-af085b920737","Type":"ContainerStarted","Data":"34a10b4299864236890c3e4a6b4421977a83091a7871ac6d279950a61626f9e2"} Nov 24 09:07:50 crc kubenswrapper[4563]: I1124 09:07:50.651975 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sf9ml" podStartSLOduration=2.133689765 podStartE2EDuration="4.651960709s" podCreationTimestamp="2025-11-24 09:07:46 +0000 UTC" firstStartedPulling="2025-11-24 09:07:47.591218972 +0000 UTC m=+244.850196419" lastFinishedPulling="2025-11-24 09:07:50.109489915 +0000 UTC m=+247.368467363" observedRunningTime="2025-11-24 09:07:50.650739175 +0000 UTC m=+247.909716621" watchObservedRunningTime="2025-11-24 09:07:50.651960709 +0000 UTC m=+247.910938156" Nov 24 09:07:50 crc kubenswrapper[4563]: I1124 09:07:50.652618 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6cgfm" podStartSLOduration=2.169558566 podStartE2EDuration="4.652612478s" podCreationTimestamp="2025-11-24 09:07:46 +0000 UTC" firstStartedPulling="2025-11-24 09:07:47.594094886 +0000 UTC m=+244.853072334" lastFinishedPulling="2025-11-24 09:07:50.077148798 +0000 UTC m=+247.336126246" observedRunningTime="2025-11-24 09:07:50.640326697 +0000 UTC m=+247.899304143" watchObservedRunningTime="2025-11-24 09:07:50.652612478 +0000 UTC m=+247.911589926" Nov 24 09:07:54 crc kubenswrapper[4563]: I1124 09:07:54.116759 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nsq9v" Nov 24 09:07:54 crc kubenswrapper[4563]: I1124 09:07:54.117238 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nsq9v" Nov 24 09:07:54 crc kubenswrapper[4563]: I1124 09:07:54.147865 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nsq9v" Nov 24 09:07:54 
crc kubenswrapper[4563]: I1124 09:07:54.668973 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nsq9v" Nov 24 09:07:54 crc kubenswrapper[4563]: I1124 09:07:54.712913 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r6r55" Nov 24 09:07:54 crc kubenswrapper[4563]: I1124 09:07:54.714277 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r6r55" Nov 24 09:07:54 crc kubenswrapper[4563]: I1124 09:07:54.747652 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r6r55" Nov 24 09:07:55 crc kubenswrapper[4563]: I1124 09:07:55.674589 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r6r55" Nov 24 09:07:56 crc kubenswrapper[4563]: I1124 09:07:56.525267 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sf9ml" Nov 24 09:07:56 crc kubenswrapper[4563]: I1124 09:07:56.525339 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sf9ml" Nov 24 09:07:56 crc kubenswrapper[4563]: I1124 09:07:56.551858 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sf9ml" Nov 24 09:07:56 crc kubenswrapper[4563]: I1124 09:07:56.672728 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sf9ml" Nov 24 09:07:57 crc kubenswrapper[4563]: I1124 09:07:57.126894 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6cgfm" Nov 24 09:07:57 crc kubenswrapper[4563]: I1124 09:07:57.127475 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-6cgfm" Nov 24 09:07:57 crc kubenswrapper[4563]: I1124 09:07:57.153231 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6cgfm" Nov 24 09:07:57 crc kubenswrapper[4563]: I1124 09:07:57.679813 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6cgfm" Nov 24 09:09:38 crc kubenswrapper[4563]: I1124 09:09:38.987304 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:09:38 crc kubenswrapper[4563]: I1124 09:09:38.987996 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:10:08 crc kubenswrapper[4563]: I1124 09:10:08.988071 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:10:08 crc kubenswrapper[4563]: I1124 09:10:08.988391 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:10:38 crc kubenswrapper[4563]: I1124 
09:10:38.987768 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:10:38 crc kubenswrapper[4563]: I1124 09:10:38.988157 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:10:38 crc kubenswrapper[4563]: I1124 09:10:38.988196 4563 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" Nov 24 09:10:38 crc kubenswrapper[4563]: I1124 09:10:38.988546 4563 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"693aaa2fd38048eca425d5cf8bf8e834a76fe87db5bd736efd4bf0270f272397"} pod="openshift-machine-config-operator/machine-config-daemon-stlxr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 09:10:38 crc kubenswrapper[4563]: I1124 09:10:38.988591 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" containerID="cri-o://693aaa2fd38048eca425d5cf8bf8e834a76fe87db5bd736efd4bf0270f272397" gracePeriod=600 Nov 24 09:10:39 crc kubenswrapper[4563]: I1124 09:10:39.261893 4563 generic.go:334] "Generic (PLEG): container finished" podID="3b2bfe55-8989-49b3-bb61-e28189447627" containerID="693aaa2fd38048eca425d5cf8bf8e834a76fe87db5bd736efd4bf0270f272397" exitCode=0 Nov 24 
09:10:39 crc kubenswrapper[4563]: I1124 09:10:39.261955 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" event={"ID":"3b2bfe55-8989-49b3-bb61-e28189447627","Type":"ContainerDied","Data":"693aaa2fd38048eca425d5cf8bf8e834a76fe87db5bd736efd4bf0270f272397"} Nov 24 09:10:39 crc kubenswrapper[4563]: I1124 09:10:39.262130 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" event={"ID":"3b2bfe55-8989-49b3-bb61-e28189447627","Type":"ContainerStarted","Data":"70f9c1996df47923e8f844e33d7bfcc71ba8679e22cd73a43ba1a4424e2bc93b"} Nov 24 09:10:39 crc kubenswrapper[4563]: I1124 09:10:39.262151 4563 scope.go:117] "RemoveContainer" containerID="6fdfbd9de15b8432813bba1d07388a4dc4ef9b023937c4e0e2a6ab44982991cd" Nov 24 09:11:05 crc kubenswrapper[4563]: I1124 09:11:05.709188 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-w6z6s"] Nov 24 09:11:05 crc kubenswrapper[4563]: I1124 09:11:05.710106 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" Nov 24 09:11:05 crc kubenswrapper[4563]: I1124 09:11:05.717756 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-w6z6s"] Nov 24 09:11:05 crc kubenswrapper[4563]: I1124 09:11:05.862678 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f925e1d-9382-498f-ab9c-70f89ee0b47e-trusted-ca\") pod \"image-registry-66df7c8f76-w6z6s\" (UID: \"2f925e1d-9382-498f-ab9c-70f89ee0b47e\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" Nov 24 09:11:05 crc kubenswrapper[4563]: I1124 09:11:05.862745 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2f925e1d-9382-498f-ab9c-70f89ee0b47e-registry-tls\") pod \"image-registry-66df7c8f76-w6z6s\" (UID: \"2f925e1d-9382-498f-ab9c-70f89ee0b47e\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" Nov 24 09:11:05 crc kubenswrapper[4563]: I1124 09:11:05.862810 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2f925e1d-9382-498f-ab9c-70f89ee0b47e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-w6z6s\" (UID: \"2f925e1d-9382-498f-ab9c-70f89ee0b47e\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" Nov 24 09:11:05 crc kubenswrapper[4563]: I1124 09:11:05.862849 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-w6z6s\" (UID: \"2f925e1d-9382-498f-ab9c-70f89ee0b47e\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" Nov 24 09:11:05 crc kubenswrapper[4563]: I1124 09:11:05.862885 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2f925e1d-9382-498f-ab9c-70f89ee0b47e-registry-certificates\") pod \"image-registry-66df7c8f76-w6z6s\" (UID: \"2f925e1d-9382-498f-ab9c-70f89ee0b47e\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" Nov 24 09:11:05 crc kubenswrapper[4563]: I1124 09:11:05.862930 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f925e1d-9382-498f-ab9c-70f89ee0b47e-bound-sa-token\") pod \"image-registry-66df7c8f76-w6z6s\" (UID: \"2f925e1d-9382-498f-ab9c-70f89ee0b47e\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" Nov 24 09:11:05 crc kubenswrapper[4563]: I1124 09:11:05.862972 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqh55\" (UniqueName: \"kubernetes.io/projected/2f925e1d-9382-498f-ab9c-70f89ee0b47e-kube-api-access-zqh55\") pod \"image-registry-66df7c8f76-w6z6s\" (UID: \"2f925e1d-9382-498f-ab9c-70f89ee0b47e\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" Nov 24 09:11:05 crc kubenswrapper[4563]: I1124 09:11:05.863107 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2f925e1d-9382-498f-ab9c-70f89ee0b47e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-w6z6s\" (UID: \"2f925e1d-9382-498f-ab9c-70f89ee0b47e\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" Nov 24 09:11:05 crc kubenswrapper[4563]: I1124 09:11:05.878006 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-w6z6s\" (UID: \"2f925e1d-9382-498f-ab9c-70f89ee0b47e\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" Nov 24 09:11:05 crc kubenswrapper[4563]: I1124 09:11:05.964294 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2f925e1d-9382-498f-ab9c-70f89ee0b47e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-w6z6s\" (UID: \"2f925e1d-9382-498f-ab9c-70f89ee0b47e\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" Nov 24 09:11:05 crc kubenswrapper[4563]: I1124 09:11:05.964336 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2f925e1d-9382-498f-ab9c-70f89ee0b47e-registry-certificates\") pod \"image-registry-66df7c8f76-w6z6s\" (UID: \"2f925e1d-9382-498f-ab9c-70f89ee0b47e\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" Nov 24 09:11:05 crc kubenswrapper[4563]: I1124 09:11:05.964369 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f925e1d-9382-498f-ab9c-70f89ee0b47e-bound-sa-token\") pod \"image-registry-66df7c8f76-w6z6s\" (UID: \"2f925e1d-9382-498f-ab9c-70f89ee0b47e\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" Nov 24 09:11:05 crc kubenswrapper[4563]: I1124 09:11:05.964390 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqh55\" (UniqueName: \"kubernetes.io/projected/2f925e1d-9382-498f-ab9c-70f89ee0b47e-kube-api-access-zqh55\") pod \"image-registry-66df7c8f76-w6z6s\" (UID: \"2f925e1d-9382-498f-ab9c-70f89ee0b47e\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" Nov 24 09:11:05 
crc kubenswrapper[4563]: I1124 09:11:05.964414 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2f925e1d-9382-498f-ab9c-70f89ee0b47e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-w6z6s\" (UID: \"2f925e1d-9382-498f-ab9c-70f89ee0b47e\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" Nov 24 09:11:05 crc kubenswrapper[4563]: I1124 09:11:05.964432 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f925e1d-9382-498f-ab9c-70f89ee0b47e-trusted-ca\") pod \"image-registry-66df7c8f76-w6z6s\" (UID: \"2f925e1d-9382-498f-ab9c-70f89ee0b47e\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" Nov 24 09:11:05 crc kubenswrapper[4563]: I1124 09:11:05.964456 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2f925e1d-9382-498f-ab9c-70f89ee0b47e-registry-tls\") pod \"image-registry-66df7c8f76-w6z6s\" (UID: \"2f925e1d-9382-498f-ab9c-70f89ee0b47e\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" Nov 24 09:11:05 crc kubenswrapper[4563]: I1124 09:11:05.964783 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2f925e1d-9382-498f-ab9c-70f89ee0b47e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-w6z6s\" (UID: \"2f925e1d-9382-498f-ab9c-70f89ee0b47e\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" Nov 24 09:11:05 crc kubenswrapper[4563]: I1124 09:11:05.965494 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f925e1d-9382-498f-ab9c-70f89ee0b47e-trusted-ca\") pod \"image-registry-66df7c8f76-w6z6s\" (UID: \"2f925e1d-9382-498f-ab9c-70f89ee0b47e\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" Nov 24 09:11:05 crc kubenswrapper[4563]: I1124 09:11:05.965522 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2f925e1d-9382-498f-ab9c-70f89ee0b47e-registry-certificates\") pod \"image-registry-66df7c8f76-w6z6s\" (UID: \"2f925e1d-9382-498f-ab9c-70f89ee0b47e\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" Nov 24 09:11:05 crc kubenswrapper[4563]: I1124 09:11:05.968965 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2f925e1d-9382-498f-ab9c-70f89ee0b47e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-w6z6s\" (UID: \"2f925e1d-9382-498f-ab9c-70f89ee0b47e\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" Nov 24 09:11:05 crc kubenswrapper[4563]: I1124 09:11:05.968994 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2f925e1d-9382-498f-ab9c-70f89ee0b47e-registry-tls\") pod \"image-registry-66df7c8f76-w6z6s\" (UID: \"2f925e1d-9382-498f-ab9c-70f89ee0b47e\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" Nov 24 09:11:05 crc kubenswrapper[4563]: I1124 09:11:05.977076 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f925e1d-9382-498f-ab9c-70f89ee0b47e-bound-sa-token\") pod \"image-registry-66df7c8f76-w6z6s\" (UID: \"2f925e1d-9382-498f-ab9c-70f89ee0b47e\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" Nov 24 09:11:05 crc kubenswrapper[4563]: I1124 09:11:05.978098 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqh55\" (UniqueName: \"kubernetes.io/projected/2f925e1d-9382-498f-ab9c-70f89ee0b47e-kube-api-access-zqh55\") pod 
\"image-registry-66df7c8f76-w6z6s\" (UID: \"2f925e1d-9382-498f-ab9c-70f89ee0b47e\") " pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" Nov 24 09:11:06 crc kubenswrapper[4563]: I1124 09:11:06.021331 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" Nov 24 09:11:06 crc kubenswrapper[4563]: I1124 09:11:06.345139 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-w6z6s"] Nov 24 09:11:06 crc kubenswrapper[4563]: I1124 09:11:06.363507 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" event={"ID":"2f925e1d-9382-498f-ab9c-70f89ee0b47e","Type":"ContainerStarted","Data":"c427577d399acb88cc1ef15a821d90ca0dcffb5ef30825a0849257861fc7b761"} Nov 24 09:11:07 crc kubenswrapper[4563]: I1124 09:11:07.369132 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" event={"ID":"2f925e1d-9382-498f-ab9c-70f89ee0b47e","Type":"ContainerStarted","Data":"f47c17edaceb24645d5dc44f52172b621a70658e3d369f1486a094ab6131a750"} Nov 24 09:11:07 crc kubenswrapper[4563]: I1124 09:11:07.369457 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" Nov 24 09:11:07 crc kubenswrapper[4563]: I1124 09:11:07.386079 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" podStartSLOduration=2.386065535 podStartE2EDuration="2.386065535s" podCreationTimestamp="2025-11-24 09:11:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:11:07.383059247 +0000 UTC m=+444.642036694" watchObservedRunningTime="2025-11-24 09:11:07.386065535 +0000 UTC m=+444.645042983" Nov 24 09:11:26 crc 
kubenswrapper[4563]: I1124 09:11:26.026539 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-w6z6s" Nov 24 09:11:26 crc kubenswrapper[4563]: I1124 09:11:26.066881 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mm4b8"] Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.093016 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" podUID="4952a751-3601-4381-9b92-d5d720b6dca2" containerName="registry" containerID="cri-o://5e7395a329c40d1d263e6ebd6b5c7f6cb277a303f2475f4549d0a8005a04e9c5" gracePeriod=30 Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.367773 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.529757 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4952a751-3601-4381-9b92-d5d720b6dca2-trusted-ca\") pod \"4952a751-3601-4381-9b92-d5d720b6dca2\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.529818 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4952a751-3601-4381-9b92-d5d720b6dca2-bound-sa-token\") pod \"4952a751-3601-4381-9b92-d5d720b6dca2\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.529850 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4952a751-3601-4381-9b92-d5d720b6dca2-registry-tls\") pod \"4952a751-3601-4381-9b92-d5d720b6dca2\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " Nov 
24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.529868 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4952a751-3601-4381-9b92-d5d720b6dca2-registry-certificates\") pod \"4952a751-3601-4381-9b92-d5d720b6dca2\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.530048 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"4952a751-3601-4381-9b92-d5d720b6dca2\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.530268 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4952a751-3601-4381-9b92-d5d720b6dca2-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4952a751-3601-4381-9b92-d5d720b6dca2" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.530414 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmd6r\" (UniqueName: \"kubernetes.io/projected/4952a751-3601-4381-9b92-d5d720b6dca2-kube-api-access-pmd6r\") pod \"4952a751-3601-4381-9b92-d5d720b6dca2\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.530464 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4952a751-3601-4381-9b92-d5d720b6dca2-ca-trust-extracted\") pod \"4952a751-3601-4381-9b92-d5d720b6dca2\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.530662 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4952a751-3601-4381-9b92-d5d720b6dca2-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4952a751-3601-4381-9b92-d5d720b6dca2" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.531226 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4952a751-3601-4381-9b92-d5d720b6dca2-installation-pull-secrets\") pod \"4952a751-3601-4381-9b92-d5d720b6dca2\" (UID: \"4952a751-3601-4381-9b92-d5d720b6dca2\") " Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.531769 4563 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4952a751-3601-4381-9b92-d5d720b6dca2-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.531784 4563 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4952a751-3601-4381-9b92-d5d720b6dca2-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.535920 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4952a751-3601-4381-9b92-d5d720b6dca2-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4952a751-3601-4381-9b92-d5d720b6dca2" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.536231 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4952a751-3601-4381-9b92-d5d720b6dca2-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4952a751-3601-4381-9b92-d5d720b6dca2" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.536476 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4952a751-3601-4381-9b92-d5d720b6dca2-kube-api-access-pmd6r" (OuterVolumeSpecName: "kube-api-access-pmd6r") pod "4952a751-3601-4381-9b92-d5d720b6dca2" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2"). InnerVolumeSpecName "kube-api-access-pmd6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.536712 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4952a751-3601-4381-9b92-d5d720b6dca2-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4952a751-3601-4381-9b92-d5d720b6dca2" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.537529 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "4952a751-3601-4381-9b92-d5d720b6dca2" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.555587 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4952a751-3601-4381-9b92-d5d720b6dca2-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4952a751-3601-4381-9b92-d5d720b6dca2" (UID: "4952a751-3601-4381-9b92-d5d720b6dca2"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.580771 4563 generic.go:334] "Generic (PLEG): container finished" podID="4952a751-3601-4381-9b92-d5d720b6dca2" containerID="5e7395a329c40d1d263e6ebd6b5c7f6cb277a303f2475f4549d0a8005a04e9c5" exitCode=0 Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.580817 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" event={"ID":"4952a751-3601-4381-9b92-d5d720b6dca2","Type":"ContainerDied","Data":"5e7395a329c40d1d263e6ebd6b5c7f6cb277a303f2475f4549d0a8005a04e9c5"} Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.580882 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" event={"ID":"4952a751-3601-4381-9b92-d5d720b6dca2","Type":"ContainerDied","Data":"d71b3d4a3bc469b9cc6a65bf99e533c23e49dba8bf903e500afddfed5fb03347"} Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.580882 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mm4b8" Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.580972 4563 scope.go:117] "RemoveContainer" containerID="5e7395a329c40d1d263e6ebd6b5c7f6cb277a303f2475f4549d0a8005a04e9c5" Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.600436 4563 scope.go:117] "RemoveContainer" containerID="5e7395a329c40d1d263e6ebd6b5c7f6cb277a303f2475f4549d0a8005a04e9c5" Nov 24 09:11:51 crc kubenswrapper[4563]: E1124 09:11:51.600900 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e7395a329c40d1d263e6ebd6b5c7f6cb277a303f2475f4549d0a8005a04e9c5\": container with ID starting with 5e7395a329c40d1d263e6ebd6b5c7f6cb277a303f2475f4549d0a8005a04e9c5 not found: ID does not exist" containerID="5e7395a329c40d1d263e6ebd6b5c7f6cb277a303f2475f4549d0a8005a04e9c5" Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.600926 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e7395a329c40d1d263e6ebd6b5c7f6cb277a303f2475f4549d0a8005a04e9c5"} err="failed to get container status \"5e7395a329c40d1d263e6ebd6b5c7f6cb277a303f2475f4549d0a8005a04e9c5\": rpc error: code = NotFound desc = could not find container \"5e7395a329c40d1d263e6ebd6b5c7f6cb277a303f2475f4549d0a8005a04e9c5\": container with ID starting with 5e7395a329c40d1d263e6ebd6b5c7f6cb277a303f2475f4549d0a8005a04e9c5 not found: ID does not exist" Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.602011 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mm4b8"] Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.606572 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mm4b8"] Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.632680 4563 reconciler_common.go:293] "Volume detached for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4952a751-3601-4381-9b92-d5d720b6dca2-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.632762 4563 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4952a751-3601-4381-9b92-d5d720b6dca2-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.632834 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmd6r\" (UniqueName: \"kubernetes.io/projected/4952a751-3601-4381-9b92-d5d720b6dca2-kube-api-access-pmd6r\") on node \"crc\" DevicePath \"\"" Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.632889 4563 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4952a751-3601-4381-9b92-d5d720b6dca2-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 24 09:11:51 crc kubenswrapper[4563]: I1124 09:11:51.632949 4563 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4952a751-3601-4381-9b92-d5d720b6dca2-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 24 09:11:53 crc kubenswrapper[4563]: I1124 09:11:53.060559 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4952a751-3601-4381-9b92-d5d720b6dca2" path="/var/lib/kubelet/pods/4952a751-3601-4381-9b92-d5d720b6dca2/volumes" Nov 24 09:12:28 crc kubenswrapper[4563]: I1124 09:12:28.688252 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-mfw5p"] Nov 24 09:12:28 crc kubenswrapper[4563]: E1124 09:12:28.689008 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4952a751-3601-4381-9b92-d5d720b6dca2" containerName="registry" Nov 24 09:12:28 crc kubenswrapper[4563]: I1124 09:12:28.689027 4563 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4952a751-3601-4381-9b92-d5d720b6dca2" containerName="registry" Nov 24 09:12:28 crc kubenswrapper[4563]: I1124 09:12:28.689117 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="4952a751-3601-4381-9b92-d5d720b6dca2" containerName="registry" Nov 24 09:12:28 crc kubenswrapper[4563]: I1124 09:12:28.689423 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-mfw5p" Nov 24 09:12:28 crc kubenswrapper[4563]: I1124 09:12:28.691483 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Nov 24 09:12:28 crc kubenswrapper[4563]: I1124 09:12:28.691580 4563 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-kthnp" Nov 24 09:12:28 crc kubenswrapper[4563]: I1124 09:12:28.691784 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Nov 24 09:12:28 crc kubenswrapper[4563]: I1124 09:12:28.698648 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-bpkp8"] Nov 24 09:12:28 crc kubenswrapper[4563]: I1124 09:12:28.699481 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-bpkp8" Nov 24 09:12:28 crc kubenswrapper[4563]: I1124 09:12:28.701011 4563 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-xzsbp" Nov 24 09:12:28 crc kubenswrapper[4563]: I1124 09:12:28.709876 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-mfw5p"] Nov 24 09:12:28 crc kubenswrapper[4563]: I1124 09:12:28.712389 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-p6b8w"] Nov 24 09:12:28 crc kubenswrapper[4563]: I1124 09:12:28.712990 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-p6b8w" Nov 24 09:12:28 crc kubenswrapper[4563]: I1124 09:12:28.715067 4563 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-tftxn" Nov 24 09:12:28 crc kubenswrapper[4563]: I1124 09:12:28.719051 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-bpkp8"] Nov 24 09:12:28 crc kubenswrapper[4563]: I1124 09:12:28.725915 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-p6b8w"] Nov 24 09:12:28 crc kubenswrapper[4563]: I1124 09:12:28.818678 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25dv6\" (UniqueName: \"kubernetes.io/projected/dbb0b52d-058f-46a3-8342-811bd3f5b495-kube-api-access-25dv6\") pod \"cert-manager-cainjector-7f985d654d-mfw5p\" (UID: \"dbb0b52d-058f-46a3-8342-811bd3f5b495\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-mfw5p" Nov 24 09:12:28 crc kubenswrapper[4563]: I1124 09:12:28.818799 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d97x\" (UniqueName: \"kubernetes.io/projected/a32e3f9f-14d2-44fb-ba5a-9ede6e568643-kube-api-access-2d97x\") pod \"cert-manager-webhook-5655c58dd6-p6b8w\" (UID: \"a32e3f9f-14d2-44fb-ba5a-9ede6e568643\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-p6b8w" Nov 24 09:12:28 crc kubenswrapper[4563]: I1124 09:12:28.819149 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhqn7\" (UniqueName: \"kubernetes.io/projected/271399d1-6304-4dcd-a3df-6c543849329e-kube-api-access-bhqn7\") pod \"cert-manager-5b446d88c5-bpkp8\" (UID: \"271399d1-6304-4dcd-a3df-6c543849329e\") " pod="cert-manager/cert-manager-5b446d88c5-bpkp8" Nov 24 09:12:28 crc kubenswrapper[4563]: I1124 09:12:28.920952 4563 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhqn7\" (UniqueName: \"kubernetes.io/projected/271399d1-6304-4dcd-a3df-6c543849329e-kube-api-access-bhqn7\") pod \"cert-manager-5b446d88c5-bpkp8\" (UID: \"271399d1-6304-4dcd-a3df-6c543849329e\") " pod="cert-manager/cert-manager-5b446d88c5-bpkp8" Nov 24 09:12:28 crc kubenswrapper[4563]: I1124 09:12:28.921043 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25dv6\" (UniqueName: \"kubernetes.io/projected/dbb0b52d-058f-46a3-8342-811bd3f5b495-kube-api-access-25dv6\") pod \"cert-manager-cainjector-7f985d654d-mfw5p\" (UID: \"dbb0b52d-058f-46a3-8342-811bd3f5b495\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-mfw5p" Nov 24 09:12:28 crc kubenswrapper[4563]: I1124 09:12:28.921093 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d97x\" (UniqueName: \"kubernetes.io/projected/a32e3f9f-14d2-44fb-ba5a-9ede6e568643-kube-api-access-2d97x\") pod \"cert-manager-webhook-5655c58dd6-p6b8w\" (UID: \"a32e3f9f-14d2-44fb-ba5a-9ede6e568643\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-p6b8w" Nov 24 09:12:28 crc kubenswrapper[4563]: I1124 09:12:28.937253 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d97x\" (UniqueName: \"kubernetes.io/projected/a32e3f9f-14d2-44fb-ba5a-9ede6e568643-kube-api-access-2d97x\") pod \"cert-manager-webhook-5655c58dd6-p6b8w\" (UID: \"a32e3f9f-14d2-44fb-ba5a-9ede6e568643\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-p6b8w" Nov 24 09:12:28 crc kubenswrapper[4563]: I1124 09:12:28.937349 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhqn7\" (UniqueName: \"kubernetes.io/projected/271399d1-6304-4dcd-a3df-6c543849329e-kube-api-access-bhqn7\") pod \"cert-manager-5b446d88c5-bpkp8\" (UID: \"271399d1-6304-4dcd-a3df-6c543849329e\") " 
pod="cert-manager/cert-manager-5b446d88c5-bpkp8" Nov 24 09:12:28 crc kubenswrapper[4563]: I1124 09:12:28.938313 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25dv6\" (UniqueName: \"kubernetes.io/projected/dbb0b52d-058f-46a3-8342-811bd3f5b495-kube-api-access-25dv6\") pod \"cert-manager-cainjector-7f985d654d-mfw5p\" (UID: \"dbb0b52d-058f-46a3-8342-811bd3f5b495\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-mfw5p" Nov 24 09:12:29 crc kubenswrapper[4563]: I1124 09:12:29.006485 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-mfw5p" Nov 24 09:12:29 crc kubenswrapper[4563]: I1124 09:12:29.014804 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-bpkp8" Nov 24 09:12:29 crc kubenswrapper[4563]: I1124 09:12:29.025826 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-p6b8w" Nov 24 09:12:29 crc kubenswrapper[4563]: I1124 09:12:29.166044 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-mfw5p"] Nov 24 09:12:29 crc kubenswrapper[4563]: I1124 09:12:29.176416 4563 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 09:12:29 crc kubenswrapper[4563]: I1124 09:12:29.412582 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-bpkp8"] Nov 24 09:12:29 crc kubenswrapper[4563]: I1124 09:12:29.418101 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-p6b8w"] Nov 24 09:12:29 crc kubenswrapper[4563]: W1124 09:12:29.418770 4563 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod271399d1_6304_4dcd_a3df_6c543849329e.slice/crio-0e008a1ecf1fa011630e68ef0905742df9bb34e01aa48bccb9a04858b1e0593d WatchSource:0}: Error finding container 0e008a1ecf1fa011630e68ef0905742df9bb34e01aa48bccb9a04858b1e0593d: Status 404 returned error can't find the container with id 0e008a1ecf1fa011630e68ef0905742df9bb34e01aa48bccb9a04858b1e0593d Nov 24 09:12:29 crc kubenswrapper[4563]: W1124 09:12:29.420406 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda32e3f9f_14d2_44fb_ba5a_9ede6e568643.slice/crio-1e71a6ee8ca6865de21d030840247eaef6e999bfedf4836a668297435c1b7d2a WatchSource:0}: Error finding container 1e71a6ee8ca6865de21d030840247eaef6e999bfedf4836a668297435c1b7d2a: Status 404 returned error can't find the container with id 1e71a6ee8ca6865de21d030840247eaef6e999bfedf4836a668297435c1b7d2a Nov 24 09:12:29 crc kubenswrapper[4563]: I1124 09:12:29.784755 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-p6b8w" event={"ID":"a32e3f9f-14d2-44fb-ba5a-9ede6e568643","Type":"ContainerStarted","Data":"1e71a6ee8ca6865de21d030840247eaef6e999bfedf4836a668297435c1b7d2a"} Nov 24 09:12:29 crc kubenswrapper[4563]: I1124 09:12:29.785989 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-bpkp8" event={"ID":"271399d1-6304-4dcd-a3df-6c543849329e","Type":"ContainerStarted","Data":"0e008a1ecf1fa011630e68ef0905742df9bb34e01aa48bccb9a04858b1e0593d"} Nov 24 09:12:29 crc kubenswrapper[4563]: I1124 09:12:29.789822 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-mfw5p" event={"ID":"dbb0b52d-058f-46a3-8342-811bd3f5b495","Type":"ContainerStarted","Data":"9a2734801aeb1ca9f21ee6d84d8441c5c61eee5519a3ac497da61256ad96009e"} Nov 24 09:12:31 crc kubenswrapper[4563]: I1124 09:12:31.799451 4563 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-mfw5p" event={"ID":"dbb0b52d-058f-46a3-8342-811bd3f5b495","Type":"ContainerStarted","Data":"d0b2f3761e0bbdc07191da01e4419526123ef436e2cdf7269e60a86da2317952"} Nov 24 09:12:31 crc kubenswrapper[4563]: I1124 09:12:31.812620 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-mfw5p" podStartSLOduration=2.21005874 podStartE2EDuration="3.812605502s" podCreationTimestamp="2025-11-24 09:12:28 +0000 UTC" firstStartedPulling="2025-11-24 09:12:29.176160966 +0000 UTC m=+526.435138413" lastFinishedPulling="2025-11-24 09:12:30.778707738 +0000 UTC m=+528.037685175" observedRunningTime="2025-11-24 09:12:31.810251873 +0000 UTC m=+529.069229321" watchObservedRunningTime="2025-11-24 09:12:31.812605502 +0000 UTC m=+529.071582949" Nov 24 09:12:32 crc kubenswrapper[4563]: I1124 09:12:32.804672 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-bpkp8" event={"ID":"271399d1-6304-4dcd-a3df-6c543849329e","Type":"ContainerStarted","Data":"837ca59e9558e82728294e50ada7f2972517491ec545d5e6f0c3caacd50aa6f5"} Nov 24 09:12:32 crc kubenswrapper[4563]: I1124 09:12:32.805935 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-p6b8w" event={"ID":"a32e3f9f-14d2-44fb-ba5a-9ede6e568643","Type":"ContainerStarted","Data":"aa4f6cbb735988bd8fed3ee98a1739240da401e86a9d481316704d7bd5c87819"} Nov 24 09:12:32 crc kubenswrapper[4563]: I1124 09:12:32.806054 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-p6b8w" Nov 24 09:12:32 crc kubenswrapper[4563]: I1124 09:12:32.818131 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-bpkp8" podStartSLOduration=2.172131838 podStartE2EDuration="4.818117556s" 
podCreationTimestamp="2025-11-24 09:12:28 +0000 UTC" firstStartedPulling="2025-11-24 09:12:29.421097728 +0000 UTC m=+526.680075175" lastFinishedPulling="2025-11-24 09:12:32.067083446 +0000 UTC m=+529.326060893" observedRunningTime="2025-11-24 09:12:32.814892794 +0000 UTC m=+530.073870241" watchObservedRunningTime="2025-11-24 09:12:32.818117556 +0000 UTC m=+530.077095003" Nov 24 09:12:32 crc kubenswrapper[4563]: I1124 09:12:32.832389 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-p6b8w" podStartSLOduration=2.186327967 podStartE2EDuration="4.832356246s" podCreationTimestamp="2025-11-24 09:12:28 +0000 UTC" firstStartedPulling="2025-11-24 09:12:29.422304483 +0000 UTC m=+526.681281930" lastFinishedPulling="2025-11-24 09:12:32.068332762 +0000 UTC m=+529.327310209" observedRunningTime="2025-11-24 09:12:32.828393905 +0000 UTC m=+530.087371352" watchObservedRunningTime="2025-11-24 09:12:32.832356246 +0000 UTC m=+530.091333694" Nov 24 09:12:39 crc kubenswrapper[4563]: I1124 09:12:39.029589 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-p6b8w" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.263863 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vgbgr"] Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.264275 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="ovn-controller" containerID="cri-o://69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d" gracePeriod=30 Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.264357 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="nbdb" 
containerID="cri-o://b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3" gracePeriod=30 Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.264433 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="ovn-acl-logging" containerID="cri-o://3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf" gracePeriod=30 Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.264406 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="kube-rbac-proxy-node" containerID="cri-o://c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96" gracePeriod=30 Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.264406 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8" gracePeriod=30 Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.264375 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="northd" containerID="cri-o://0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0" gracePeriod=30 Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.264575 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="sbdb" containerID="cri-o://2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b" gracePeriod=30 Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 
09:12:40.308521 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="ovnkube-controller" containerID="cri-o://02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f" gracePeriod=30 Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.560817 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vgbgr_cee9b713-10b0-49a5-841d-fbb083faba9a/ovnkube-controller/3.log" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.565297 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vgbgr_cee9b713-10b0-49a5-841d-fbb083faba9a/ovn-acl-logging/0.log" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.565845 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vgbgr_cee9b713-10b0-49a5-841d-fbb083faba9a/ovn-controller/0.log" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.566497 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.617254 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mlpdb"] Nov 24 09:12:40 crc kubenswrapper[4563]: E1124 09:12:40.617507 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="ovnkube-controller" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.617528 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="ovnkube-controller" Nov 24 09:12:40 crc kubenswrapper[4563]: E1124 09:12:40.617538 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="sbdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.617544 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="sbdb" Nov 24 09:12:40 crc kubenswrapper[4563]: E1124 09:12:40.617553 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="ovnkube-controller" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.617559 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="ovnkube-controller" Nov 24 09:12:40 crc kubenswrapper[4563]: E1124 09:12:40.617567 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="ovnkube-controller" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.617573 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="ovnkube-controller" Nov 24 09:12:40 crc kubenswrapper[4563]: E1124 09:12:40.617580 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" 
containerName="nbdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.617587 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="nbdb" Nov 24 09:12:40 crc kubenswrapper[4563]: E1124 09:12:40.617599 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="ovn-controller" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.617605 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="ovn-controller" Nov 24 09:12:40 crc kubenswrapper[4563]: E1124 09:12:40.617611 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="ovn-acl-logging" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.617616 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="ovn-acl-logging" Nov 24 09:12:40 crc kubenswrapper[4563]: E1124 09:12:40.617625 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="kube-rbac-proxy-node" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.617631 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="kube-rbac-proxy-node" Nov 24 09:12:40 crc kubenswrapper[4563]: E1124 09:12:40.617651 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="kube-rbac-proxy-ovn-metrics" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.617658 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="kube-rbac-proxy-ovn-metrics" Nov 24 09:12:40 crc kubenswrapper[4563]: E1124 09:12:40.617670 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" 
containerName="kubecfg-setup" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.617677 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="kubecfg-setup" Nov 24 09:12:40 crc kubenswrapper[4563]: E1124 09:12:40.617684 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="northd" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.617697 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="northd" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.617800 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="ovnkube-controller" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.617809 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="kube-rbac-proxy-node" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.617817 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="kube-rbac-proxy-ovn-metrics" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.617823 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="ovnkube-controller" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.617831 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="nbdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.617838 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="sbdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.617846 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" 
containerName="ovn-acl-logging" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.617853 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="ovnkube-controller" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.617861 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="ovn-controller" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.617868 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="ovnkube-controller" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.617874 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="ovnkube-controller" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.617881 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="northd" Nov 24 09:12:40 crc kubenswrapper[4563]: E1124 09:12:40.617967 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="ovnkube-controller" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.617975 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="ovnkube-controller" Nov 24 09:12:40 crc kubenswrapper[4563]: E1124 09:12:40.617999 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="ovnkube-controller" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.618008 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerName="ovnkube-controller" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.619684 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.669850 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-cni-bin\") pod \"cee9b713-10b0-49a5-841d-fbb083faba9a\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.669901 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-etc-openvswitch\") pod \"cee9b713-10b0-49a5-841d-fbb083faba9a\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.669930 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-node-log\") pod \"cee9b713-10b0-49a5-841d-fbb083faba9a\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.669992 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-var-lib-openvswitch\") pod \"cee9b713-10b0-49a5-841d-fbb083faba9a\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.669977 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "cee9b713-10b0-49a5-841d-fbb083faba9a" (UID: "cee9b713-10b0-49a5-841d-fbb083faba9a"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.670021 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "cee9b713-10b0-49a5-841d-fbb083faba9a" (UID: "cee9b713-10b0-49a5-841d-fbb083faba9a"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.670089 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-node-log" (OuterVolumeSpecName: "node-log") pod "cee9b713-10b0-49a5-841d-fbb083faba9a" (UID: "cee9b713-10b0-49a5-841d-fbb083faba9a"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.670107 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "cee9b713-10b0-49a5-841d-fbb083faba9a" (UID: "cee9b713-10b0-49a5-841d-fbb083faba9a"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.670137 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-systemd-units\") pod \"cee9b713-10b0-49a5-841d-fbb083faba9a\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.670171 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cee9b713-10b0-49a5-841d-fbb083faba9a-ovn-node-metrics-cert\") pod \"cee9b713-10b0-49a5-841d-fbb083faba9a\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.670190 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cee9b713-10b0-49a5-841d-fbb083faba9a-ovnkube-config\") pod \"cee9b713-10b0-49a5-841d-fbb083faba9a\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.670223 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "cee9b713-10b0-49a5-841d-fbb083faba9a" (UID: "cee9b713-10b0-49a5-841d-fbb083faba9a"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.670233 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-run-openvswitch\") pod \"cee9b713-10b0-49a5-841d-fbb083faba9a\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.670267 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-run-netns\") pod \"cee9b713-10b0-49a5-841d-fbb083faba9a\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.670320 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "cee9b713-10b0-49a5-841d-fbb083faba9a" (UID: "cee9b713-10b0-49a5-841d-fbb083faba9a"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.670345 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cee9b713-10b0-49a5-841d-fbb083faba9a-ovnkube-script-lib\") pod \"cee9b713-10b0-49a5-841d-fbb083faba9a\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.670342 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "cee9b713-10b0-49a5-841d-fbb083faba9a" (UID: "cee9b713-10b0-49a5-841d-fbb083faba9a"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.670373 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-kubelet\") pod \"cee9b713-10b0-49a5-841d-fbb083faba9a\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.670412 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"cee9b713-10b0-49a5-841d-fbb083faba9a\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.670431 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-run-ovn\") pod \"cee9b713-10b0-49a5-841d-fbb083faba9a\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.670484 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "cee9b713-10b0-49a5-841d-fbb083faba9a" (UID: "cee9b713-10b0-49a5-841d-fbb083faba9a"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.670483 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "cee9b713-10b0-49a5-841d-fbb083faba9a" (UID: "cee9b713-10b0-49a5-841d-fbb083faba9a"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.670528 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-run-systemd\") pod \"cee9b713-10b0-49a5-841d-fbb083faba9a\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.670575 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-log-socket\") pod \"cee9b713-10b0-49a5-841d-fbb083faba9a\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.670549 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "cee9b713-10b0-49a5-841d-fbb083faba9a" (UID: "cee9b713-10b0-49a5-841d-fbb083faba9a"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.670605 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d62m\" (UniqueName: \"kubernetes.io/projected/cee9b713-10b0-49a5-841d-fbb083faba9a-kube-api-access-5d62m\") pod \"cee9b713-10b0-49a5-841d-fbb083faba9a\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.670653 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-log-socket" (OuterVolumeSpecName: "log-socket") pod "cee9b713-10b0-49a5-841d-fbb083faba9a" (UID: "cee9b713-10b0-49a5-841d-fbb083faba9a"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.670772 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cee9b713-10b0-49a5-841d-fbb083faba9a-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "cee9b713-10b0-49a5-841d-fbb083faba9a" (UID: "cee9b713-10b0-49a5-841d-fbb083faba9a"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.671112 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-slash" (OuterVolumeSpecName: "host-slash") pod "cee9b713-10b0-49a5-841d-fbb083faba9a" (UID: "cee9b713-10b0-49a5-841d-fbb083faba9a"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.671086 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-slash\") pod \"cee9b713-10b0-49a5-841d-fbb083faba9a\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.670819 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cee9b713-10b0-49a5-841d-fbb083faba9a-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "cee9b713-10b0-49a5-841d-fbb083faba9a" (UID: "cee9b713-10b0-49a5-841d-fbb083faba9a"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.671182 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cee9b713-10b0-49a5-841d-fbb083faba9a-env-overrides\") pod \"cee9b713-10b0-49a5-841d-fbb083faba9a\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.671222 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-cni-netd\") pod \"cee9b713-10b0-49a5-841d-fbb083faba9a\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.671240 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-run-ovn-kubernetes\") pod \"cee9b713-10b0-49a5-841d-fbb083faba9a\" (UID: \"cee9b713-10b0-49a5-841d-fbb083faba9a\") " Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.671300 4563 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "cee9b713-10b0-49a5-841d-fbb083faba9a" (UID: "cee9b713-10b0-49a5-841d-fbb083faba9a"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.671385 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "cee9b713-10b0-49a5-841d-fbb083faba9a" (UID: "cee9b713-10b0-49a5-841d-fbb083faba9a"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.671510 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cee9b713-10b0-49a5-841d-fbb083faba9a-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "cee9b713-10b0-49a5-841d-fbb083faba9a" (UID: "cee9b713-10b0-49a5-841d-fbb083faba9a"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.671690 4563 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-cni-netd\") on node \"crc\" DevicePath \"\"" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.671706 4563 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.671716 4563 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-cni-bin\") on node \"crc\" DevicePath \"\"" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.671724 4563 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.671732 4563 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-node-log\") on node \"crc\" DevicePath \"\"" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.671739 4563 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.671747 4563 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-systemd-units\") on node \"crc\" DevicePath \"\"" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 
09:12:40.671754 4563 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cee9b713-10b0-49a5-841d-fbb083faba9a-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.671761 4563 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-run-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.671768 4563 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-run-netns\") on node \"crc\" DevicePath \"\"" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.671777 4563 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cee9b713-10b0-49a5-841d-fbb083faba9a-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.671785 4563 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-kubelet\") on node \"crc\" DevicePath \"\"" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.671793 4563 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.671801 4563 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.671809 4563 reconciler_common.go:293] "Volume detached for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-log-socket\") on node \"crc\" DevicePath \"\"" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.671816 4563 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-host-slash\") on node \"crc\" DevicePath \"\"" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.671823 4563 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cee9b713-10b0-49a5-841d-fbb083faba9a-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.675675 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cee9b713-10b0-49a5-841d-fbb083faba9a-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "cee9b713-10b0-49a5-841d-fbb083faba9a" (UID: "cee9b713-10b0-49a5-841d-fbb083faba9a"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.676362 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cee9b713-10b0-49a5-841d-fbb083faba9a-kube-api-access-5d62m" (OuterVolumeSpecName: "kube-api-access-5d62m") pod "cee9b713-10b0-49a5-841d-fbb083faba9a" (UID: "cee9b713-10b0-49a5-841d-fbb083faba9a"). InnerVolumeSpecName "kube-api-access-5d62m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.682938 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "cee9b713-10b0-49a5-841d-fbb083faba9a" (UID: "cee9b713-10b0-49a5-841d-fbb083faba9a"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.773702 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-host-run-netns\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.773781 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8m48\" (UniqueName: \"kubernetes.io/projected/ddbb16f7-7589-449e-81de-aa376f3edf00-kube-api-access-f8m48\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.773819 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-run-openvswitch\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.773861 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-host-kubelet\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.773895 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-log-socket\") pod \"ovnkube-node-mlpdb\" (UID: 
\"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.773997 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-host-slash\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.774019 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ddbb16f7-7589-449e-81de-aa376f3edf00-ovnkube-script-lib\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.774048 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ddbb16f7-7589-449e-81de-aa376f3edf00-ovn-node-metrics-cert\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.774087 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ddbb16f7-7589-449e-81de-aa376f3edf00-ovnkube-config\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.774122 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-host-cni-bin\") pod 
\"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.774149 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-run-ovn\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.774173 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-host-cni-netd\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.774212 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-host-run-ovn-kubernetes\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.774258 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-etc-openvswitch\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.774277 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-var-lib-openvswitch\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.774308 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.774351 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-node-log\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.774394 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-run-systemd\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.774441 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-systemd-units\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.774470 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ddbb16f7-7589-449e-81de-aa376f3edf00-env-overrides\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.774547 4563 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cee9b713-10b0-49a5-841d-fbb083faba9a-run-systemd\") on node \"crc\" DevicePath \"\"" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.774568 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5d62m\" (UniqueName: \"kubernetes.io/projected/cee9b713-10b0-49a5-841d-fbb083faba9a-kube-api-access-5d62m\") on node \"crc\" DevicePath \"\"" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.774583 4563 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cee9b713-10b0-49a5-841d-fbb083faba9a-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.843079 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vgbgr_cee9b713-10b0-49a5-841d-fbb083faba9a/ovnkube-controller/3.log" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.844838 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vgbgr_cee9b713-10b0-49a5-841d-fbb083faba9a/ovn-acl-logging/0.log" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845232 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vgbgr_cee9b713-10b0-49a5-841d-fbb083faba9a/ovn-controller/0.log" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845564 4563 generic.go:334] "Generic (PLEG): container finished" podID="cee9b713-10b0-49a5-841d-fbb083faba9a" 
containerID="02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f" exitCode=0 Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845591 4563 generic.go:334] "Generic (PLEG): container finished" podID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerID="2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b" exitCode=0 Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845598 4563 generic.go:334] "Generic (PLEG): container finished" podID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerID="b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3" exitCode=0 Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845605 4563 generic.go:334] "Generic (PLEG): container finished" podID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerID="0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0" exitCode=0 Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845611 4563 generic.go:334] "Generic (PLEG): container finished" podID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerID="39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8" exitCode=0 Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845619 4563 generic.go:334] "Generic (PLEG): container finished" podID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerID="c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96" exitCode=0 Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845625 4563 generic.go:334] "Generic (PLEG): container finished" podID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerID="3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf" exitCode=143 Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845631 4563 generic.go:334] "Generic (PLEG): container finished" podID="cee9b713-10b0-49a5-841d-fbb083faba9a" containerID="69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d" exitCode=143 Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845658 4563 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845610 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" event={"ID":"cee9b713-10b0-49a5-841d-fbb083faba9a","Type":"ContainerDied","Data":"02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845712 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" event={"ID":"cee9b713-10b0-49a5-841d-fbb083faba9a","Type":"ContainerDied","Data":"2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845730 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" event={"ID":"cee9b713-10b0-49a5-841d-fbb083faba9a","Type":"ContainerDied","Data":"b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845747 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" event={"ID":"cee9b713-10b0-49a5-841d-fbb083faba9a","Type":"ContainerDied","Data":"0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845757 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" event={"ID":"cee9b713-10b0-49a5-841d-fbb083faba9a","Type":"ContainerDied","Data":"39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845767 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" event={"ID":"cee9b713-10b0-49a5-841d-fbb083faba9a","Type":"ContainerDied","Data":"c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96"} Nov 24 09:12:40 
crc kubenswrapper[4563]: I1124 09:12:40.845773 4563 scope.go:117] "RemoveContainer" containerID="02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845782 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845794 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845799 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845806 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845812 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845819 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845823 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845828 4563 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845832 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845839 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" event={"ID":"cee9b713-10b0-49a5-841d-fbb083faba9a","Type":"ContainerDied","Data":"3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845846 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845852 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845856 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845861 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845866 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0"} Nov 24 
09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845871 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845876 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845880 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845885 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845890 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845897 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" event={"ID":"cee9b713-10b0-49a5-841d-fbb083faba9a","Type":"ContainerDied","Data":"69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845904 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845910 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845915 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845919 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845924 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845928 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845933 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845937 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845942 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845946 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845952 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vgbgr" event={"ID":"cee9b713-10b0-49a5-841d-fbb083faba9a","Type":"ContainerDied","Data":"8f586bd3089d03400821f573ab0381f0bcba9faa5e1d46557ce28d5e286c3f3d"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845958 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845964 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845969 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845975 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845991 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.845997 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.846002 4563 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.846008 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.846013 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.846019 4563 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.847459 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nw8xd_019bd805-9123-494a-bb29-f39b924e6243/kube-multus/2.log" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.847962 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nw8xd_019bd805-9123-494a-bb29-f39b924e6243/kube-multus/1.log" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.848012 4563 generic.go:334] "Generic (PLEG): container finished" podID="019bd805-9123-494a-bb29-f39b924e6243" containerID="eb2ac8e61357886c955d8ea2e45d3e7697fed103f3408d6c13b3011e6f152b1c" exitCode=2 Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.848041 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nw8xd" event={"ID":"019bd805-9123-494a-bb29-f39b924e6243","Type":"ContainerDied","Data":"eb2ac8e61357886c955d8ea2e45d3e7697fed103f3408d6c13b3011e6f152b1c"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.848065 4563 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"381c5f62c655111b7df341bae96a5edef6bcd2d5c3a8758d07465c278445bb8a"} Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.848511 4563 scope.go:117] "RemoveContainer" containerID="eb2ac8e61357886c955d8ea2e45d3e7697fed103f3408d6c13b3011e6f152b1c" Nov 24 09:12:40 crc kubenswrapper[4563]: E1124 09:12:40.848816 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-nw8xd_openshift-multus(019bd805-9123-494a-bb29-f39b924e6243)\"" pod="openshift-multus/multus-nw8xd" podUID="019bd805-9123-494a-bb29-f39b924e6243" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.862382 4563 scope.go:117] "RemoveContainer" containerID="d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.872974 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vgbgr"] Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.875392 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-log-socket\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.875441 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-host-slash\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.875463 4563 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ddbb16f7-7589-449e-81de-aa376f3edf00-ovnkube-script-lib\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.875483 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ddbb16f7-7589-449e-81de-aa376f3edf00-ovn-node-metrics-cert\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.875508 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ddbb16f7-7589-449e-81de-aa376f3edf00-ovnkube-config\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.875539 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-host-cni-bin\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.875542 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-host-slash\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.875556 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-run-ovn\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.875582 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-log-socket\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.875600 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-host-cni-netd\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.875629 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-host-run-ovn-kubernetes\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.875669 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-etc-openvswitch\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.875686 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-var-lib-openvswitch\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.875707 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.875730 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-node-log\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.875755 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-run-systemd\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.875767 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-host-run-ovn-kubernetes\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.875784 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-systemd-units\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.875801 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-host-cni-bin\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.875805 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ddbb16f7-7589-449e-81de-aa376f3edf00-env-overrides\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.875858 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-host-run-netns\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.875879 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8m48\" (UniqueName: \"kubernetes.io/projected/ddbb16f7-7589-449e-81de-aa376f3edf00-kube-api-access-f8m48\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.875898 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-run-openvswitch\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.875923 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-host-kubelet\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.876008 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-host-kubelet\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.876029 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-run-ovn\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.876050 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-host-cni-netd\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.876069 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-host-run-netns\") pod \"ovnkube-node-mlpdb\" (UID: 
\"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.876144 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ddbb16f7-7589-449e-81de-aa376f3edf00-ovnkube-script-lib\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.876201 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.876231 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-etc-openvswitch\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.876254 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-var-lib-openvswitch\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.876273 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-run-openvswitch\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.876303 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-run-systemd\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.876325 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-node-log\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.876348 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ddbb16f7-7589-449e-81de-aa376f3edf00-systemd-units\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.876847 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ddbb16f7-7589-449e-81de-aa376f3edf00-ovnkube-config\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.876935 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ddbb16f7-7589-449e-81de-aa376f3edf00-env-overrides\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.879592 4563 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ddbb16f7-7589-449e-81de-aa376f3edf00-ovn-node-metrics-cert\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.881575 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vgbgr"] Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.886671 4563 scope.go:117] "RemoveContainer" containerID="2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.891883 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8m48\" (UniqueName: \"kubernetes.io/projected/ddbb16f7-7589-449e-81de-aa376f3edf00-kube-api-access-f8m48\") pod \"ovnkube-node-mlpdb\" (UID: \"ddbb16f7-7589-449e-81de-aa376f3edf00\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.897747 4563 scope.go:117] "RemoveContainer" containerID="b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.908015 4563 scope.go:117] "RemoveContainer" containerID="0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.921116 4563 scope.go:117] "RemoveContainer" containerID="39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.930813 4563 scope.go:117] "RemoveContainer" containerID="c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.931082 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.941761 4563 scope.go:117] "RemoveContainer" containerID="3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf" Nov 24 09:12:40 crc kubenswrapper[4563]: W1124 09:12:40.947991 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddbb16f7_7589_449e_81de_aa376f3edf00.slice/crio-5362a6d6fc978cdabb31798133cd6ba1c1af14f62391253d058322899f9ae753 WatchSource:0}: Error finding container 5362a6d6fc978cdabb31798133cd6ba1c1af14f62391253d058322899f9ae753: Status 404 returned error can't find the container with id 5362a6d6fc978cdabb31798133cd6ba1c1af14f62391253d058322899f9ae753 Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.951575 4563 scope.go:117] "RemoveContainer" containerID="69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.968217 4563 scope.go:117] "RemoveContainer" containerID="5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.977880 4563 scope.go:117] "RemoveContainer" containerID="02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f" Nov 24 09:12:40 crc kubenswrapper[4563]: E1124 09:12:40.978482 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f\": container with ID starting with 02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f not found: ID does not exist" containerID="02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.978536 4563 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f"} err="failed to get container status \"02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f\": rpc error: code = NotFound desc = could not find container \"02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f\": container with ID starting with 02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.978583 4563 scope.go:117] "RemoveContainer" containerID="d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772" Nov 24 09:12:40 crc kubenswrapper[4563]: E1124 09:12:40.978866 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772\": container with ID starting with d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772 not found: ID does not exist" containerID="d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.978954 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772"} err="failed to get container status \"d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772\": rpc error: code = NotFound desc = could not find container \"d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772\": container with ID starting with d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772 not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.979046 4563 scope.go:117] "RemoveContainer" containerID="2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b" Nov 24 09:12:40 crc kubenswrapper[4563]: E1124 09:12:40.979388 4563 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\": container with ID starting with 2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b not found: ID does not exist" containerID="2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.979458 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b"} err="failed to get container status \"2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\": rpc error: code = NotFound desc = could not find container \"2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\": container with ID starting with 2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.979524 4563 scope.go:117] "RemoveContainer" containerID="b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3" Nov 24 09:12:40 crc kubenswrapper[4563]: E1124 09:12:40.980181 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\": container with ID starting with b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3 not found: ID does not exist" containerID="b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.980218 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3"} err="failed to get container status \"b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\": rpc error: code = NotFound desc = could not find container 
\"b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\": container with ID starting with b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3 not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.980243 4563 scope.go:117] "RemoveContainer" containerID="0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0" Nov 24 09:12:40 crc kubenswrapper[4563]: E1124 09:12:40.980889 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\": container with ID starting with 0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0 not found: ID does not exist" containerID="0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.980931 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0"} err="failed to get container status \"0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\": rpc error: code = NotFound desc = could not find container \"0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\": container with ID starting with 0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0 not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.980953 4563 scope.go:117] "RemoveContainer" containerID="39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8" Nov 24 09:12:40 crc kubenswrapper[4563]: E1124 09:12:40.981244 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\": container with ID starting with 39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8 not found: ID does not exist" 
containerID="39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.981267 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8"} err="failed to get container status \"39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\": rpc error: code = NotFound desc = could not find container \"39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\": container with ID starting with 39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8 not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.981279 4563 scope.go:117] "RemoveContainer" containerID="c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96" Nov 24 09:12:40 crc kubenswrapper[4563]: E1124 09:12:40.981539 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\": container with ID starting with c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96 not found: ID does not exist" containerID="c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.981577 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96"} err="failed to get container status \"c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\": rpc error: code = NotFound desc = could not find container \"c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\": container with ID starting with c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96 not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.981604 4563 scope.go:117] 
"RemoveContainer" containerID="3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf" Nov 24 09:12:40 crc kubenswrapper[4563]: E1124 09:12:40.981936 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\": container with ID starting with 3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf not found: ID does not exist" containerID="3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.981968 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf"} err="failed to get container status \"3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\": rpc error: code = NotFound desc = could not find container \"3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\": container with ID starting with 3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.981999 4563 scope.go:117] "RemoveContainer" containerID="69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d" Nov 24 09:12:40 crc kubenswrapper[4563]: E1124 09:12:40.982247 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\": container with ID starting with 69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d not found: ID does not exist" containerID="69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.982274 4563 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d"} err="failed to get container status \"69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\": rpc error: code = NotFound desc = could not find container \"69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\": container with ID starting with 69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.982291 4563 scope.go:117] "RemoveContainer" containerID="5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2" Nov 24 09:12:40 crc kubenswrapper[4563]: E1124 09:12:40.982531 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\": container with ID starting with 5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2 not found: ID does not exist" containerID="5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.982553 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2"} err="failed to get container status \"5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\": rpc error: code = NotFound desc = could not find container \"5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\": container with ID starting with 5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2 not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.982572 4563 scope.go:117] "RemoveContainer" containerID="02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.983779 4563 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f"} err="failed to get container status \"02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f\": rpc error: code = NotFound desc = could not find container \"02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f\": container with ID starting with 02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.983842 4563 scope.go:117] "RemoveContainer" containerID="d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.984289 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772"} err="failed to get container status \"d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772\": rpc error: code = NotFound desc = could not find container \"d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772\": container with ID starting with d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772 not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.984848 4563 scope.go:117] "RemoveContainer" containerID="2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.986308 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b"} err="failed to get container status \"2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\": rpc error: code = NotFound desc = could not find container \"2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\": container with ID starting with 2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b not 
found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.986387 4563 scope.go:117] "RemoveContainer" containerID="b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.986797 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3"} err="failed to get container status \"b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\": rpc error: code = NotFound desc = could not find container \"b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\": container with ID starting with b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3 not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.986843 4563 scope.go:117] "RemoveContainer" containerID="0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.987181 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0"} err="failed to get container status \"0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\": rpc error: code = NotFound desc = could not find container \"0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\": container with ID starting with 0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0 not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.987211 4563 scope.go:117] "RemoveContainer" containerID="39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.987450 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8"} err="failed to get 
container status \"39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\": rpc error: code = NotFound desc = could not find container \"39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\": container with ID starting with 39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8 not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.987478 4563 scope.go:117] "RemoveContainer" containerID="c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.988010 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96"} err="failed to get container status \"c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\": rpc error: code = NotFound desc = could not find container \"c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\": container with ID starting with c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96 not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.988040 4563 scope.go:117] "RemoveContainer" containerID="3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.988334 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf"} err="failed to get container status \"3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\": rpc error: code = NotFound desc = could not find container \"3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\": container with ID starting with 3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.988358 4563 scope.go:117] "RemoveContainer" 
containerID="69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.988721 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d"} err="failed to get container status \"69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\": rpc error: code = NotFound desc = could not find container \"69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\": container with ID starting with 69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.988752 4563 scope.go:117] "RemoveContainer" containerID="5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.989246 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2"} err="failed to get container status \"5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\": rpc error: code = NotFound desc = could not find container \"5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\": container with ID starting with 5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2 not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.989269 4563 scope.go:117] "RemoveContainer" containerID="02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.989611 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f"} err="failed to get container status \"02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f\": rpc error: code = NotFound desc = could 
not find container \"02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f\": container with ID starting with 02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.989659 4563 scope.go:117] "RemoveContainer" containerID="d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.990009 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772"} err="failed to get container status \"d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772\": rpc error: code = NotFound desc = could not find container \"d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772\": container with ID starting with d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772 not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.990037 4563 scope.go:117] "RemoveContainer" containerID="2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.990380 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b"} err="failed to get container status \"2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\": rpc error: code = NotFound desc = could not find container \"2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\": container with ID starting with 2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.990405 4563 scope.go:117] "RemoveContainer" containerID="b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 
09:12:40.990730 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3"} err="failed to get container status \"b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\": rpc error: code = NotFound desc = could not find container \"b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\": container with ID starting with b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3 not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.990759 4563 scope.go:117] "RemoveContainer" containerID="0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.991061 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0"} err="failed to get container status \"0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\": rpc error: code = NotFound desc = could not find container \"0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\": container with ID starting with 0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0 not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.991087 4563 scope.go:117] "RemoveContainer" containerID="39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.991398 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8"} err="failed to get container status \"39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\": rpc error: code = NotFound desc = could not find container \"39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\": container with ID starting with 
39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8 not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.991425 4563 scope.go:117] "RemoveContainer" containerID="c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.991748 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96"} err="failed to get container status \"c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\": rpc error: code = NotFound desc = could not find container \"c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\": container with ID starting with c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96 not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.991770 4563 scope.go:117] "RemoveContainer" containerID="3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.992139 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf"} err="failed to get container status \"3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\": rpc error: code = NotFound desc = could not find container \"3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\": container with ID starting with 3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.992169 4563 scope.go:117] "RemoveContainer" containerID="69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.992677 4563 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d"} err="failed to get container status \"69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\": rpc error: code = NotFound desc = could not find container \"69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\": container with ID starting with 69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.992703 4563 scope.go:117] "RemoveContainer" containerID="5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.993018 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2"} err="failed to get container status \"5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\": rpc error: code = NotFound desc = could not find container \"5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\": container with ID starting with 5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2 not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.993051 4563 scope.go:117] "RemoveContainer" containerID="02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.993382 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f"} err="failed to get container status \"02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f\": rpc error: code = NotFound desc = could not find container \"02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f\": container with ID starting with 02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f not found: ID does not 
exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.993409 4563 scope.go:117] "RemoveContainer" containerID="d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.993743 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772"} err="failed to get container status \"d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772\": rpc error: code = NotFound desc = could not find container \"d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772\": container with ID starting with d8ea78aa34a63013b434ff1210dfacc0eb22d25df49e0fda3ce2041244621772 not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.993777 4563 scope.go:117] "RemoveContainer" containerID="2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.994108 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b"} err="failed to get container status \"2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\": rpc error: code = NotFound desc = could not find container \"2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b\": container with ID starting with 2b79c17e10e88db36db382537ad00d002b2a0c3fd7aecbe281e90fb90ec70f6b not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.994129 4563 scope.go:117] "RemoveContainer" containerID="b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.994416 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3"} err="failed to get container status 
\"b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\": rpc error: code = NotFound desc = could not find container \"b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3\": container with ID starting with b991991c0cb0a17a3a5c2f14977ab1212029d9a84d42acc21b2e2304b7b8d9f3 not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.994438 4563 scope.go:117] "RemoveContainer" containerID="0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.994696 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0"} err="failed to get container status \"0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\": rpc error: code = NotFound desc = could not find container \"0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0\": container with ID starting with 0744759471e5bc1db14a5ee207c8a68cdfec7bdb54ebf0f9f2b655dc6b280cb0 not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.994718 4563 scope.go:117] "RemoveContainer" containerID="39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.995005 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8"} err="failed to get container status \"39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\": rpc error: code = NotFound desc = could not find container \"39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8\": container with ID starting with 39ba0ad3be39393042dae04c91db62349cbcc3fa61a3768336608db0186b00e8 not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.995036 4563 scope.go:117] "RemoveContainer" 
containerID="c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.995339 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96"} err="failed to get container status \"c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\": rpc error: code = NotFound desc = could not find container \"c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96\": container with ID starting with c9a88abb398281e7d9ac6717139ae7c6b6ab5a8518f82b7bd907be1fcae1bc96 not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.995365 4563 scope.go:117] "RemoveContainer" containerID="3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.995718 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf"} err="failed to get container status \"3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\": rpc error: code = NotFound desc = could not find container \"3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf\": container with ID starting with 3ca7203cfb02d49b3b1d3f7652da76da90bc8013813eaa6179572ac8c22cdbdf not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.995743 4563 scope.go:117] "RemoveContainer" containerID="69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.996057 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d"} err="failed to get container status \"69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\": rpc error: code = NotFound desc = could 
not find container \"69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d\": container with ID starting with 69e6366b733b291b774ed72a2e8a66159992aef27b2aaa41e8ae92c5fdc8be9d not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.996078 4563 scope.go:117] "RemoveContainer" containerID="5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.996389 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2"} err="failed to get container status \"5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\": rpc error: code = NotFound desc = could not find container \"5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2\": container with ID starting with 5f67738fdf789660564d3b4757ec91493532fa710efb2547de4d020eacee46c2 not found: ID does not exist" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.996410 4563 scope.go:117] "RemoveContainer" containerID="02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f" Nov 24 09:12:40 crc kubenswrapper[4563]: I1124 09:12:40.996691 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f"} err="failed to get container status \"02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f\": rpc error: code = NotFound desc = could not find container \"02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f\": container with ID starting with 02d68daa32878c6aeaee3a313f0d94baf2e72e2e85a808d81ef660097ff4478f not found: ID does not exist" Nov 24 09:12:41 crc kubenswrapper[4563]: I1124 09:12:41.061941 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cee9b713-10b0-49a5-841d-fbb083faba9a" 
path="/var/lib/kubelet/pods/cee9b713-10b0-49a5-841d-fbb083faba9a/volumes" Nov 24 09:12:41 crc kubenswrapper[4563]: I1124 09:12:41.855537 4563 generic.go:334] "Generic (PLEG): container finished" podID="ddbb16f7-7589-449e-81de-aa376f3edf00" containerID="8105cbccb3a581288f572691aa15c26b1cc10c097ac6038233025fb64b99a3de" exitCode=0 Nov 24 09:12:41 crc kubenswrapper[4563]: I1124 09:12:41.855618 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" event={"ID":"ddbb16f7-7589-449e-81de-aa376f3edf00","Type":"ContainerDied","Data":"8105cbccb3a581288f572691aa15c26b1cc10c097ac6038233025fb64b99a3de"} Nov 24 09:12:41 crc kubenswrapper[4563]: I1124 09:12:41.855692 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" event={"ID":"ddbb16f7-7589-449e-81de-aa376f3edf00","Type":"ContainerStarted","Data":"5362a6d6fc978cdabb31798133cd6ba1c1af14f62391253d058322899f9ae753"} Nov 24 09:12:42 crc kubenswrapper[4563]: I1124 09:12:42.864880 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" event={"ID":"ddbb16f7-7589-449e-81de-aa376f3edf00","Type":"ContainerStarted","Data":"05effffebaddc2f84154633c727346bfd5ca13e5d0aab6c2a203dfa5dba5eba5"} Nov 24 09:12:42 crc kubenswrapper[4563]: I1124 09:12:42.865219 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" event={"ID":"ddbb16f7-7589-449e-81de-aa376f3edf00","Type":"ContainerStarted","Data":"3b7765e1cd29971672bc2c6f83dc65c29f5e3cd405a2abdf0b3ec934141e7ad0"} Nov 24 09:12:42 crc kubenswrapper[4563]: I1124 09:12:42.865230 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" event={"ID":"ddbb16f7-7589-449e-81de-aa376f3edf00","Type":"ContainerStarted","Data":"127bb97c37f83018aa570b209efd00fd06c694c1a660ee2353d019a1ea49db6c"} Nov 24 09:12:42 crc kubenswrapper[4563]: I1124 09:12:42.865243 4563 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" event={"ID":"ddbb16f7-7589-449e-81de-aa376f3edf00","Type":"ContainerStarted","Data":"02e5d75f02f2cb5ca6e5b0737dc2b126ed0cb74d4c6b5398ee6d699e9237e037"} Nov 24 09:12:42 crc kubenswrapper[4563]: I1124 09:12:42.865251 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" event={"ID":"ddbb16f7-7589-449e-81de-aa376f3edf00","Type":"ContainerStarted","Data":"b867f67aaa7c1ad401f264c474cd661280297ef980e7486be26c03d5c5867781"} Nov 24 09:12:42 crc kubenswrapper[4563]: I1124 09:12:42.865259 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" event={"ID":"ddbb16f7-7589-449e-81de-aa376f3edf00","Type":"ContainerStarted","Data":"9ef041c01b1b22c05ba2eda0f0af04de33276f9b023a92fec6bbd0db5105fae2"} Nov 24 09:12:43 crc kubenswrapper[4563]: I1124 09:12:43.151114 4563 scope.go:117] "RemoveContainer" containerID="381c5f62c655111b7df341bae96a5edef6bcd2d5c3a8758d07465c278445bb8a" Nov 24 09:12:43 crc kubenswrapper[4563]: I1124 09:12:43.870887 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nw8xd_019bd805-9123-494a-bb29-f39b924e6243/kube-multus/2.log" Nov 24 09:12:44 crc kubenswrapper[4563]: I1124 09:12:44.881176 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" event={"ID":"ddbb16f7-7589-449e-81de-aa376f3edf00","Type":"ContainerStarted","Data":"b39af4eb8269c35aa92851a8bfc953af90c81b35d3ff67437a040a8d1bbf4887"} Nov 24 09:12:46 crc kubenswrapper[4563]: I1124 09:12:46.894620 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" event={"ID":"ddbb16f7-7589-449e-81de-aa376f3edf00","Type":"ContainerStarted","Data":"9412112d0f49919e81f8c0aa93f7fbd6ef27fcf0b8478687d60d1d63bf01832c"} Nov 24 09:12:46 crc kubenswrapper[4563]: I1124 09:12:46.895435 4563 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:46 crc kubenswrapper[4563]: I1124 09:12:46.919801 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" podStartSLOduration=6.919780252 podStartE2EDuration="6.919780252s" podCreationTimestamp="2025-11-24 09:12:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:12:46.918522561 +0000 UTC m=+544.177500008" watchObservedRunningTime="2025-11-24 09:12:46.919780252 +0000 UTC m=+544.178757699" Nov 24 09:12:46 crc kubenswrapper[4563]: I1124 09:12:46.921526 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:47 crc kubenswrapper[4563]: I1124 09:12:47.899764 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:47 crc kubenswrapper[4563]: I1124 09:12:47.900051 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:47 crc kubenswrapper[4563]: I1124 09:12:47.930245 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:12:52 crc kubenswrapper[4563]: I1124 09:12:52.054413 4563 scope.go:117] "RemoveContainer" containerID="eb2ac8e61357886c955d8ea2e45d3e7697fed103f3408d6c13b3011e6f152b1c" Nov 24 09:12:52 crc kubenswrapper[4563]: E1124 09:12:52.054979 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-nw8xd_openshift-multus(019bd805-9123-494a-bb29-f39b924e6243)\"" pod="openshift-multus/multus-nw8xd" 
podUID="019bd805-9123-494a-bb29-f39b924e6243" Nov 24 09:13:04 crc kubenswrapper[4563]: I1124 09:13:04.054789 4563 scope.go:117] "RemoveContainer" containerID="eb2ac8e61357886c955d8ea2e45d3e7697fed103f3408d6c13b3011e6f152b1c" Nov 24 09:13:04 crc kubenswrapper[4563]: I1124 09:13:04.986967 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nw8xd_019bd805-9123-494a-bb29-f39b924e6243/kube-multus/2.log" Nov 24 09:13:04 crc kubenswrapper[4563]: I1124 09:13:04.987308 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nw8xd" event={"ID":"019bd805-9123-494a-bb29-f39b924e6243","Type":"ContainerStarted","Data":"64388944445c5c81ed402684504ab3f2587e654ca34332bcf66b2a9a89629e0c"} Nov 24 09:13:08 crc kubenswrapper[4563]: I1124 09:13:08.701024 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc"] Nov 24 09:13:08 crc kubenswrapper[4563]: I1124 09:13:08.702341 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc" Nov 24 09:13:08 crc kubenswrapper[4563]: I1124 09:13:08.704177 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 24 09:13:08 crc kubenswrapper[4563]: I1124 09:13:08.711246 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc"] Nov 24 09:13:08 crc kubenswrapper[4563]: I1124 09:13:08.726126 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gj4n\" (UniqueName: \"kubernetes.io/projected/e25e7e42-b065-4947-a1d0-3d641b371d06-kube-api-access-8gj4n\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc\" (UID: \"e25e7e42-b065-4947-a1d0-3d641b371d06\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc" Nov 24 09:13:08 crc kubenswrapper[4563]: I1124 09:13:08.726170 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e25e7e42-b065-4947-a1d0-3d641b371d06-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc\" (UID: \"e25e7e42-b065-4947-a1d0-3d641b371d06\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc" Nov 24 09:13:08 crc kubenswrapper[4563]: I1124 09:13:08.726229 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e25e7e42-b065-4947-a1d0-3d641b371d06-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc\" (UID: \"e25e7e42-b065-4947-a1d0-3d641b371d06\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc" Nov 24 09:13:08 crc kubenswrapper[4563]: 
I1124 09:13:08.827092 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e25e7e42-b065-4947-a1d0-3d641b371d06-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc\" (UID: \"e25e7e42-b065-4947-a1d0-3d641b371d06\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc" Nov 24 09:13:08 crc kubenswrapper[4563]: I1124 09:13:08.827178 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gj4n\" (UniqueName: \"kubernetes.io/projected/e25e7e42-b065-4947-a1d0-3d641b371d06-kube-api-access-8gj4n\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc\" (UID: \"e25e7e42-b065-4947-a1d0-3d641b371d06\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc" Nov 24 09:13:08 crc kubenswrapper[4563]: I1124 09:13:08.827207 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e25e7e42-b065-4947-a1d0-3d641b371d06-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc\" (UID: \"e25e7e42-b065-4947-a1d0-3d641b371d06\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc" Nov 24 09:13:08 crc kubenswrapper[4563]: I1124 09:13:08.827664 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e25e7e42-b065-4947-a1d0-3d641b371d06-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc\" (UID: \"e25e7e42-b065-4947-a1d0-3d641b371d06\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc" Nov 24 09:13:08 crc kubenswrapper[4563]: I1124 09:13:08.827665 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/e25e7e42-b065-4947-a1d0-3d641b371d06-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc\" (UID: \"e25e7e42-b065-4947-a1d0-3d641b371d06\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc" Nov 24 09:13:08 crc kubenswrapper[4563]: I1124 09:13:08.844476 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gj4n\" (UniqueName: \"kubernetes.io/projected/e25e7e42-b065-4947-a1d0-3d641b371d06-kube-api-access-8gj4n\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc\" (UID: \"e25e7e42-b065-4947-a1d0-3d641b371d06\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc" Nov 24 09:13:08 crc kubenswrapper[4563]: I1124 09:13:08.987741 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:13:08 crc kubenswrapper[4563]: I1124 09:13:08.987806 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:13:09 crc kubenswrapper[4563]: I1124 09:13:09.016221 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc" Nov 24 09:13:09 crc kubenswrapper[4563]: I1124 09:13:09.358404 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc"] Nov 24 09:13:10 crc kubenswrapper[4563]: I1124 09:13:10.011343 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc" event={"ID":"e25e7e42-b065-4947-a1d0-3d641b371d06","Type":"ContainerStarted","Data":"1f8c6bc7d1f51eca6e0714e2d8e35eb7deac8cd9788f6abea215245419a55af6"} Nov 24 09:13:10 crc kubenswrapper[4563]: I1124 09:13:10.011744 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc" event={"ID":"e25e7e42-b065-4947-a1d0-3d641b371d06","Type":"ContainerStarted","Data":"a2636afd84f332feb844d0eedce9f94904d3f0031bc771c820f71a8de7f31d1f"} Nov 24 09:13:10 crc kubenswrapper[4563]: I1124 09:13:10.951993 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mlpdb" Nov 24 09:13:11 crc kubenswrapper[4563]: I1124 09:13:11.017283 4563 generic.go:334] "Generic (PLEG): container finished" podID="e25e7e42-b065-4947-a1d0-3d641b371d06" containerID="1f8c6bc7d1f51eca6e0714e2d8e35eb7deac8cd9788f6abea215245419a55af6" exitCode=0 Nov 24 09:13:11 crc kubenswrapper[4563]: I1124 09:13:11.017362 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc" event={"ID":"e25e7e42-b065-4947-a1d0-3d641b371d06","Type":"ContainerDied","Data":"1f8c6bc7d1f51eca6e0714e2d8e35eb7deac8cd9788f6abea215245419a55af6"} Nov 24 09:13:13 crc kubenswrapper[4563]: I1124 09:13:13.027127 4563 generic.go:334] "Generic (PLEG): container finished" 
podID="e25e7e42-b065-4947-a1d0-3d641b371d06" containerID="2d8a2ce942762d77ab6e3dcefef7df8185bb2b94baa4314088254b8738c720a2" exitCode=0 Nov 24 09:13:13 crc kubenswrapper[4563]: I1124 09:13:13.027230 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc" event={"ID":"e25e7e42-b065-4947-a1d0-3d641b371d06","Type":"ContainerDied","Data":"2d8a2ce942762d77ab6e3dcefef7df8185bb2b94baa4314088254b8738c720a2"} Nov 24 09:13:14 crc kubenswrapper[4563]: I1124 09:13:14.034370 4563 generic.go:334] "Generic (PLEG): container finished" podID="e25e7e42-b065-4947-a1d0-3d641b371d06" containerID="3fd2233b742e5b1413b6f5eacfe73bdc2d91a8301964fd91e3b40c3367b21223" exitCode=0 Nov 24 09:13:14 crc kubenswrapper[4563]: I1124 09:13:14.034419 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc" event={"ID":"e25e7e42-b065-4947-a1d0-3d641b371d06","Type":"ContainerDied","Data":"3fd2233b742e5b1413b6f5eacfe73bdc2d91a8301964fd91e3b40c3367b21223"} Nov 24 09:13:15 crc kubenswrapper[4563]: I1124 09:13:15.203452 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc" Nov 24 09:13:15 crc kubenswrapper[4563]: I1124 09:13:15.402125 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e25e7e42-b065-4947-a1d0-3d641b371d06-util\") pod \"e25e7e42-b065-4947-a1d0-3d641b371d06\" (UID: \"e25e7e42-b065-4947-a1d0-3d641b371d06\") " Nov 24 09:13:15 crc kubenswrapper[4563]: I1124 09:13:15.402218 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e25e7e42-b065-4947-a1d0-3d641b371d06-bundle\") pod \"e25e7e42-b065-4947-a1d0-3d641b371d06\" (UID: \"e25e7e42-b065-4947-a1d0-3d641b371d06\") " Nov 24 09:13:15 crc kubenswrapper[4563]: I1124 09:13:15.402306 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gj4n\" (UniqueName: \"kubernetes.io/projected/e25e7e42-b065-4947-a1d0-3d641b371d06-kube-api-access-8gj4n\") pod \"e25e7e42-b065-4947-a1d0-3d641b371d06\" (UID: \"e25e7e42-b065-4947-a1d0-3d641b371d06\") " Nov 24 09:13:15 crc kubenswrapper[4563]: I1124 09:13:15.402780 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e25e7e42-b065-4947-a1d0-3d641b371d06-bundle" (OuterVolumeSpecName: "bundle") pod "e25e7e42-b065-4947-a1d0-3d641b371d06" (UID: "e25e7e42-b065-4947-a1d0-3d641b371d06"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:13:15 crc kubenswrapper[4563]: I1124 09:13:15.407542 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e25e7e42-b065-4947-a1d0-3d641b371d06-kube-api-access-8gj4n" (OuterVolumeSpecName: "kube-api-access-8gj4n") pod "e25e7e42-b065-4947-a1d0-3d641b371d06" (UID: "e25e7e42-b065-4947-a1d0-3d641b371d06"). InnerVolumeSpecName "kube-api-access-8gj4n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:13:15 crc kubenswrapper[4563]: I1124 09:13:15.409524 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e25e7e42-b065-4947-a1d0-3d641b371d06-util" (OuterVolumeSpecName: "util") pod "e25e7e42-b065-4947-a1d0-3d641b371d06" (UID: "e25e7e42-b065-4947-a1d0-3d641b371d06"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:13:15 crc kubenswrapper[4563]: I1124 09:13:15.503790 4563 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e25e7e42-b065-4947-a1d0-3d641b371d06-util\") on node \"crc\" DevicePath \"\"" Nov 24 09:13:15 crc kubenswrapper[4563]: I1124 09:13:15.503824 4563 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e25e7e42-b065-4947-a1d0-3d641b371d06-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:13:15 crc kubenswrapper[4563]: I1124 09:13:15.503835 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gj4n\" (UniqueName: \"kubernetes.io/projected/e25e7e42-b065-4947-a1d0-3d641b371d06-kube-api-access-8gj4n\") on node \"crc\" DevicePath \"\"" Nov 24 09:13:16 crc kubenswrapper[4563]: I1124 09:13:16.042202 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc" event={"ID":"e25e7e42-b065-4947-a1d0-3d641b371d06","Type":"ContainerDied","Data":"a2636afd84f332feb844d0eedce9f94904d3f0031bc771c820f71a8de7f31d1f"} Nov 24 09:13:16 crc kubenswrapper[4563]: I1124 09:13:16.042237 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2636afd84f332feb844d0eedce9f94904d3f0031bc771c820f71a8de7f31d1f" Nov 24 09:13:16 crc kubenswrapper[4563]: I1124 09:13:16.042253 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc" Nov 24 09:13:20 crc kubenswrapper[4563]: I1124 09:13:20.178741 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-ppw2s"] Nov 24 09:13:20 crc kubenswrapper[4563]: E1124 09:13:20.179122 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e25e7e42-b065-4947-a1d0-3d641b371d06" containerName="pull" Nov 24 09:13:20 crc kubenswrapper[4563]: I1124 09:13:20.179134 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="e25e7e42-b065-4947-a1d0-3d641b371d06" containerName="pull" Nov 24 09:13:20 crc kubenswrapper[4563]: E1124 09:13:20.179141 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e25e7e42-b065-4947-a1d0-3d641b371d06" containerName="extract" Nov 24 09:13:20 crc kubenswrapper[4563]: I1124 09:13:20.179146 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="e25e7e42-b065-4947-a1d0-3d641b371d06" containerName="extract" Nov 24 09:13:20 crc kubenswrapper[4563]: E1124 09:13:20.179158 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e25e7e42-b065-4947-a1d0-3d641b371d06" containerName="util" Nov 24 09:13:20 crc kubenswrapper[4563]: I1124 09:13:20.179165 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="e25e7e42-b065-4947-a1d0-3d641b371d06" containerName="util" Nov 24 09:13:20 crc kubenswrapper[4563]: I1124 09:13:20.179248 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="e25e7e42-b065-4947-a1d0-3d641b371d06" containerName="extract" Nov 24 09:13:20 crc kubenswrapper[4563]: I1124 09:13:20.179551 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-ppw2s" Nov 24 09:13:20 crc kubenswrapper[4563]: I1124 09:13:20.181204 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-c7hdf" Nov 24 09:13:20 crc kubenswrapper[4563]: I1124 09:13:20.181619 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Nov 24 09:13:20 crc kubenswrapper[4563]: I1124 09:13:20.181811 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Nov 24 09:13:20 crc kubenswrapper[4563]: I1124 09:13:20.192593 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-ppw2s"] Nov 24 09:13:20 crc kubenswrapper[4563]: I1124 09:13:20.257254 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-892rp\" (UniqueName: \"kubernetes.io/projected/e4241626-5fdc-4620-9ffd-6bdc19046a33-kube-api-access-892rp\") pod \"nmstate-operator-557fdffb88-ppw2s\" (UID: \"e4241626-5fdc-4620-9ffd-6bdc19046a33\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-ppw2s" Nov 24 09:13:20 crc kubenswrapper[4563]: I1124 09:13:20.358057 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-892rp\" (UniqueName: \"kubernetes.io/projected/e4241626-5fdc-4620-9ffd-6bdc19046a33-kube-api-access-892rp\") pod \"nmstate-operator-557fdffb88-ppw2s\" (UID: \"e4241626-5fdc-4620-9ffd-6bdc19046a33\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-ppw2s" Nov 24 09:13:20 crc kubenswrapper[4563]: I1124 09:13:20.374441 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-892rp\" (UniqueName: \"kubernetes.io/projected/e4241626-5fdc-4620-9ffd-6bdc19046a33-kube-api-access-892rp\") pod \"nmstate-operator-557fdffb88-ppw2s\" (UID: 
\"e4241626-5fdc-4620-9ffd-6bdc19046a33\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-ppw2s" Nov 24 09:13:20 crc kubenswrapper[4563]: I1124 09:13:20.492829 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-ppw2s" Nov 24 09:13:20 crc kubenswrapper[4563]: I1124 09:13:20.659085 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-ppw2s"] Nov 24 09:13:21 crc kubenswrapper[4563]: I1124 09:13:21.063860 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-557fdffb88-ppw2s" event={"ID":"e4241626-5fdc-4620-9ffd-6bdc19046a33","Type":"ContainerStarted","Data":"05971e556939603f6bf0b51e42bfe29427e0150bb96439929f20a1bbff6d8560"} Nov 24 09:13:23 crc kubenswrapper[4563]: I1124 09:13:23.074880 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-557fdffb88-ppw2s" event={"ID":"e4241626-5fdc-4620-9ffd-6bdc19046a33","Type":"ContainerStarted","Data":"bcc573f3d563db602480247cc7c7bbce3f58a97b39f0b407c5e9b994070159f9"} Nov 24 09:13:23 crc kubenswrapper[4563]: I1124 09:13:23.085994 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-557fdffb88-ppw2s" podStartSLOduration=1.140440746 podStartE2EDuration="3.085979911s" podCreationTimestamp="2025-11-24 09:13:20 +0000 UTC" firstStartedPulling="2025-11-24 09:13:20.666137663 +0000 UTC m=+577.925115111" lastFinishedPulling="2025-11-24 09:13:22.61167683 +0000 UTC m=+579.870654276" observedRunningTime="2025-11-24 09:13:23.085140287 +0000 UTC m=+580.344117734" watchObservedRunningTime="2025-11-24 09:13:23.085979911 +0000 UTC m=+580.344957357" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.001386 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-xf2lb"] Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 
09:13:29.002455 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-xf2lb" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.004343 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-lfbv2" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.009874 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-qz8gz"] Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.010529 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-qz8gz" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.013188 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.013563 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-xf2lb"] Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.020595 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-qz8gz"] Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.037040 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-jbdfx"] Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.037821 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-jbdfx" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.057782 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/564d8757-3a04-48f3-b3a2-109930f83a10-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-qz8gz\" (UID: \"564d8757-3a04-48f3-b3a2-109930f83a10\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-qz8gz" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.057990 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d54pb\" (UniqueName: \"kubernetes.io/projected/71fe428e-199c-422c-8911-79d2a7d27ab1-kube-api-access-d54pb\") pod \"nmstate-handler-jbdfx\" (UID: \"71fe428e-199c-422c-8911-79d2a7d27ab1\") " pod="openshift-nmstate/nmstate-handler-jbdfx" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.058126 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/71fe428e-199c-422c-8911-79d2a7d27ab1-ovs-socket\") pod \"nmstate-handler-jbdfx\" (UID: \"71fe428e-199c-422c-8911-79d2a7d27ab1\") " pod="openshift-nmstate/nmstate-handler-jbdfx" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.058248 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-758jc\" (UniqueName: \"kubernetes.io/projected/d0a1ac8a-df66-4ac5-9aed-a2001c905f21-kube-api-access-758jc\") pod \"nmstate-metrics-5dcf9c57c5-xf2lb\" (UID: \"d0a1ac8a-df66-4ac5-9aed-a2001c905f21\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-xf2lb" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.058812 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntk79\" (UniqueName: 
\"kubernetes.io/projected/564d8757-3a04-48f3-b3a2-109930f83a10-kube-api-access-ntk79\") pod \"nmstate-webhook-6b89b748d8-qz8gz\" (UID: \"564d8757-3a04-48f3-b3a2-109930f83a10\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-qz8gz" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.058986 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/71fe428e-199c-422c-8911-79d2a7d27ab1-dbus-socket\") pod \"nmstate-handler-jbdfx\" (UID: \"71fe428e-199c-422c-8911-79d2a7d27ab1\") " pod="openshift-nmstate/nmstate-handler-jbdfx" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.059058 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/71fe428e-199c-422c-8911-79d2a7d27ab1-nmstate-lock\") pod \"nmstate-handler-jbdfx\" (UID: \"71fe428e-199c-422c-8911-79d2a7d27ab1\") " pod="openshift-nmstate/nmstate-handler-jbdfx" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.117542 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tqw7b"] Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.118887 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tqw7b" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.120370 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tqw7b"] Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.123309 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-vw4cq" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.123675 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.123858 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.159933 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/71fe428e-199c-422c-8911-79d2a7d27ab1-dbus-socket\") pod \"nmstate-handler-jbdfx\" (UID: \"71fe428e-199c-422c-8911-79d2a7d27ab1\") " pod="openshift-nmstate/nmstate-handler-jbdfx" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.160005 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/71fe428e-199c-422c-8911-79d2a7d27ab1-nmstate-lock\") pod \"nmstate-handler-jbdfx\" (UID: \"71fe428e-199c-422c-8911-79d2a7d27ab1\") " pod="openshift-nmstate/nmstate-handler-jbdfx" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.160101 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/89298ce0-9e0a-4351-96a9-4b69233c7ba0-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-tqw7b\" (UID: \"89298ce0-9e0a-4351-96a9-4b69233c7ba0\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tqw7b" Nov 24 09:13:29 crc 
kubenswrapper[4563]: I1124 09:13:29.160083 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/71fe428e-199c-422c-8911-79d2a7d27ab1-nmstate-lock\") pod \"nmstate-handler-jbdfx\" (UID: \"71fe428e-199c-422c-8911-79d2a7d27ab1\") " pod="openshift-nmstate/nmstate-handler-jbdfx" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.160236 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/564d8757-3a04-48f3-b3a2-109930f83a10-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-qz8gz\" (UID: \"564d8757-3a04-48f3-b3a2-109930f83a10\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-qz8gz" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.160245 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/71fe428e-199c-422c-8911-79d2a7d27ab1-dbus-socket\") pod \"nmstate-handler-jbdfx\" (UID: \"71fe428e-199c-422c-8911-79d2a7d27ab1\") " pod="openshift-nmstate/nmstate-handler-jbdfx" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.160326 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmvvn\" (UniqueName: \"kubernetes.io/projected/89298ce0-9e0a-4351-96a9-4b69233c7ba0-kube-api-access-vmvvn\") pod \"nmstate-console-plugin-5874bd7bc5-tqw7b\" (UID: \"89298ce0-9e0a-4351-96a9-4b69233c7ba0\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tqw7b" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.160380 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d54pb\" (UniqueName: \"kubernetes.io/projected/71fe428e-199c-422c-8911-79d2a7d27ab1-kube-api-access-d54pb\") pod \"nmstate-handler-jbdfx\" (UID: \"71fe428e-199c-422c-8911-79d2a7d27ab1\") " pod="openshift-nmstate/nmstate-handler-jbdfx" Nov 24 09:13:29 crc 
kubenswrapper[4563]: I1124 09:13:29.160407 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/71fe428e-199c-422c-8911-79d2a7d27ab1-ovs-socket\") pod \"nmstate-handler-jbdfx\" (UID: \"71fe428e-199c-422c-8911-79d2a7d27ab1\") " pod="openshift-nmstate/nmstate-handler-jbdfx" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.160472 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-758jc\" (UniqueName: \"kubernetes.io/projected/d0a1ac8a-df66-4ac5-9aed-a2001c905f21-kube-api-access-758jc\") pod \"nmstate-metrics-5dcf9c57c5-xf2lb\" (UID: \"d0a1ac8a-df66-4ac5-9aed-a2001c905f21\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-xf2lb" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.160562 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/71fe428e-199c-422c-8911-79d2a7d27ab1-ovs-socket\") pod \"nmstate-handler-jbdfx\" (UID: \"71fe428e-199c-422c-8911-79d2a7d27ab1\") " pod="openshift-nmstate/nmstate-handler-jbdfx" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.160612 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntk79\" (UniqueName: \"kubernetes.io/projected/564d8757-3a04-48f3-b3a2-109930f83a10-kube-api-access-ntk79\") pod \"nmstate-webhook-6b89b748d8-qz8gz\" (UID: \"564d8757-3a04-48f3-b3a2-109930f83a10\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-qz8gz" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.160922 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/89298ce0-9e0a-4351-96a9-4b69233c7ba0-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-tqw7b\" (UID: \"89298ce0-9e0a-4351-96a9-4b69233c7ba0\") " 
pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tqw7b" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.174485 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntk79\" (UniqueName: \"kubernetes.io/projected/564d8757-3a04-48f3-b3a2-109930f83a10-kube-api-access-ntk79\") pod \"nmstate-webhook-6b89b748d8-qz8gz\" (UID: \"564d8757-3a04-48f3-b3a2-109930f83a10\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-qz8gz" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.174935 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d54pb\" (UniqueName: \"kubernetes.io/projected/71fe428e-199c-422c-8911-79d2a7d27ab1-kube-api-access-d54pb\") pod \"nmstate-handler-jbdfx\" (UID: \"71fe428e-199c-422c-8911-79d2a7d27ab1\") " pod="openshift-nmstate/nmstate-handler-jbdfx" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.176043 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-758jc\" (UniqueName: \"kubernetes.io/projected/d0a1ac8a-df66-4ac5-9aed-a2001c905f21-kube-api-access-758jc\") pod \"nmstate-metrics-5dcf9c57c5-xf2lb\" (UID: \"d0a1ac8a-df66-4ac5-9aed-a2001c905f21\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-xf2lb" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.178461 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/564d8757-3a04-48f3-b3a2-109930f83a10-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-qz8gz\" (UID: \"564d8757-3a04-48f3-b3a2-109930f83a10\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-qz8gz" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.262429 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/89298ce0-9e0a-4351-96a9-4b69233c7ba0-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-tqw7b\" (UID: 
\"89298ce0-9e0a-4351-96a9-4b69233c7ba0\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tqw7b" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.262509 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/89298ce0-9e0a-4351-96a9-4b69233c7ba0-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-tqw7b\" (UID: \"89298ce0-9e0a-4351-96a9-4b69233c7ba0\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tqw7b" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.262541 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmvvn\" (UniqueName: \"kubernetes.io/projected/89298ce0-9e0a-4351-96a9-4b69233c7ba0-kube-api-access-vmvvn\") pod \"nmstate-console-plugin-5874bd7bc5-tqw7b\" (UID: \"89298ce0-9e0a-4351-96a9-4b69233c7ba0\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tqw7b" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.263801 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/89298ce0-9e0a-4351-96a9-4b69233c7ba0-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-tqw7b\" (UID: \"89298ce0-9e0a-4351-96a9-4b69233c7ba0\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tqw7b" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.267143 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/89298ce0-9e0a-4351-96a9-4b69233c7ba0-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-tqw7b\" (UID: \"89298ce0-9e0a-4351-96a9-4b69233c7ba0\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tqw7b" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.275873 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7f5d97b9d-d474f"] Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.276690 
4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f5d97b9d-d474f" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.287378 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmvvn\" (UniqueName: \"kubernetes.io/projected/89298ce0-9e0a-4351-96a9-4b69233c7ba0-kube-api-access-vmvvn\") pod \"nmstate-console-plugin-5874bd7bc5-tqw7b\" (UID: \"89298ce0-9e0a-4351-96a9-4b69233c7ba0\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tqw7b" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.289160 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f5d97b9d-d474f"] Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.315711 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-xf2lb" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.323985 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-qz8gz" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.349612 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-jbdfx" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.363214 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c840d193-9418-46bf-8271-d03d57fb14c0-service-ca\") pod \"console-7f5d97b9d-d474f\" (UID: \"c840d193-9418-46bf-8271-d03d57fb14c0\") " pod="openshift-console/console-7f5d97b9d-d474f" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.363274 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c840d193-9418-46bf-8271-d03d57fb14c0-console-serving-cert\") pod \"console-7f5d97b9d-d474f\" (UID: \"c840d193-9418-46bf-8271-d03d57fb14c0\") " pod="openshift-console/console-7f5d97b9d-d474f" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.363300 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4sj9\" (UniqueName: \"kubernetes.io/projected/c840d193-9418-46bf-8271-d03d57fb14c0-kube-api-access-z4sj9\") pod \"console-7f5d97b9d-d474f\" (UID: \"c840d193-9418-46bf-8271-d03d57fb14c0\") " pod="openshift-console/console-7f5d97b9d-d474f" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.363322 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c840d193-9418-46bf-8271-d03d57fb14c0-console-config\") pod \"console-7f5d97b9d-d474f\" (UID: \"c840d193-9418-46bf-8271-d03d57fb14c0\") " pod="openshift-console/console-7f5d97b9d-d474f" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.363370 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c840d193-9418-46bf-8271-d03d57fb14c0-trusted-ca-bundle\") pod 
\"console-7f5d97b9d-d474f\" (UID: \"c840d193-9418-46bf-8271-d03d57fb14c0\") " pod="openshift-console/console-7f5d97b9d-d474f" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.363396 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c840d193-9418-46bf-8271-d03d57fb14c0-console-oauth-config\") pod \"console-7f5d97b9d-d474f\" (UID: \"c840d193-9418-46bf-8271-d03d57fb14c0\") " pod="openshift-console/console-7f5d97b9d-d474f" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.363416 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c840d193-9418-46bf-8271-d03d57fb14c0-oauth-serving-cert\") pod \"console-7f5d97b9d-d474f\" (UID: \"c840d193-9418-46bf-8271-d03d57fb14c0\") " pod="openshift-console/console-7f5d97b9d-d474f" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.431261 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tqw7b" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.465126 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c840d193-9418-46bf-8271-d03d57fb14c0-console-oauth-config\") pod \"console-7f5d97b9d-d474f\" (UID: \"c840d193-9418-46bf-8271-d03d57fb14c0\") " pod="openshift-console/console-7f5d97b9d-d474f" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.465172 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c840d193-9418-46bf-8271-d03d57fb14c0-oauth-serving-cert\") pod \"console-7f5d97b9d-d474f\" (UID: \"c840d193-9418-46bf-8271-d03d57fb14c0\") " pod="openshift-console/console-7f5d97b9d-d474f" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.465202 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c840d193-9418-46bf-8271-d03d57fb14c0-service-ca\") pod \"console-7f5d97b9d-d474f\" (UID: \"c840d193-9418-46bf-8271-d03d57fb14c0\") " pod="openshift-console/console-7f5d97b9d-d474f" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.465235 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c840d193-9418-46bf-8271-d03d57fb14c0-console-serving-cert\") pod \"console-7f5d97b9d-d474f\" (UID: \"c840d193-9418-46bf-8271-d03d57fb14c0\") " pod="openshift-console/console-7f5d97b9d-d474f" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.465261 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4sj9\" (UniqueName: \"kubernetes.io/projected/c840d193-9418-46bf-8271-d03d57fb14c0-kube-api-access-z4sj9\") pod \"console-7f5d97b9d-d474f\" (UID: 
\"c840d193-9418-46bf-8271-d03d57fb14c0\") " pod="openshift-console/console-7f5d97b9d-d474f" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.465292 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c840d193-9418-46bf-8271-d03d57fb14c0-console-config\") pod \"console-7f5d97b9d-d474f\" (UID: \"c840d193-9418-46bf-8271-d03d57fb14c0\") " pod="openshift-console/console-7f5d97b9d-d474f" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.465322 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c840d193-9418-46bf-8271-d03d57fb14c0-trusted-ca-bundle\") pod \"console-7f5d97b9d-d474f\" (UID: \"c840d193-9418-46bf-8271-d03d57fb14c0\") " pod="openshift-console/console-7f5d97b9d-d474f" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.466394 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c840d193-9418-46bf-8271-d03d57fb14c0-oauth-serving-cert\") pod \"console-7f5d97b9d-d474f\" (UID: \"c840d193-9418-46bf-8271-d03d57fb14c0\") " pod="openshift-console/console-7f5d97b9d-d474f" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.466482 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c840d193-9418-46bf-8271-d03d57fb14c0-trusted-ca-bundle\") pod \"console-7f5d97b9d-d474f\" (UID: \"c840d193-9418-46bf-8271-d03d57fb14c0\") " pod="openshift-console/console-7f5d97b9d-d474f" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.466492 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c840d193-9418-46bf-8271-d03d57fb14c0-console-config\") pod \"console-7f5d97b9d-d474f\" (UID: \"c840d193-9418-46bf-8271-d03d57fb14c0\") " 
pod="openshift-console/console-7f5d97b9d-d474f" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.466846 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c840d193-9418-46bf-8271-d03d57fb14c0-service-ca\") pod \"console-7f5d97b9d-d474f\" (UID: \"c840d193-9418-46bf-8271-d03d57fb14c0\") " pod="openshift-console/console-7f5d97b9d-d474f" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.471541 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c840d193-9418-46bf-8271-d03d57fb14c0-console-oauth-config\") pod \"console-7f5d97b9d-d474f\" (UID: \"c840d193-9418-46bf-8271-d03d57fb14c0\") " pod="openshift-console/console-7f5d97b9d-d474f" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.471852 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c840d193-9418-46bf-8271-d03d57fb14c0-console-serving-cert\") pod \"console-7f5d97b9d-d474f\" (UID: \"c840d193-9418-46bf-8271-d03d57fb14c0\") " pod="openshift-console/console-7f5d97b9d-d474f" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.479982 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4sj9\" (UniqueName: \"kubernetes.io/projected/c840d193-9418-46bf-8271-d03d57fb14c0-kube-api-access-z4sj9\") pod \"console-7f5d97b9d-d474f\" (UID: \"c840d193-9418-46bf-8271-d03d57fb14c0\") " pod="openshift-console/console-7f5d97b9d-d474f" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.500387 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-xf2lb"] Nov 24 09:13:29 crc kubenswrapper[4563]: W1124 09:13:29.509554 4563 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0a1ac8a_df66_4ac5_9aed_a2001c905f21.slice/crio-5ff538f461c03c1ac14f9eb1e7abe4f9db8e40f60ea7c9d6ce80e8f3e43d74d7 WatchSource:0}: Error finding container 5ff538f461c03c1ac14f9eb1e7abe4f9db8e40f60ea7c9d6ce80e8f3e43d74d7: Status 404 returned error can't find the container with id 5ff538f461c03c1ac14f9eb1e7abe4f9db8e40f60ea7c9d6ce80e8f3e43d74d7 Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.604524 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f5d97b9d-d474f" Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.669560 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-qz8gz"] Nov 24 09:13:29 crc kubenswrapper[4563]: W1124 09:13:29.676275 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod564d8757_3a04_48f3_b3a2_109930f83a10.slice/crio-6c7fbd7512976bf5ce2e0d7846454a533d953b4d0fba00b27f57afc1813019ae WatchSource:0}: Error finding container 6c7fbd7512976bf5ce2e0d7846454a533d953b4d0fba00b27f57afc1813019ae: Status 404 returned error can't find the container with id 6c7fbd7512976bf5ce2e0d7846454a533d953b4d0fba00b27f57afc1813019ae Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.772503 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tqw7b"] Nov 24 09:13:29 crc kubenswrapper[4563]: W1124 09:13:29.776700 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89298ce0_9e0a_4351_96a9_4b69233c7ba0.slice/crio-e9c22cc2aad9771153eae546de0bc9e66b89ca6577b3ec97c05b019643e2231b WatchSource:0}: Error finding container e9c22cc2aad9771153eae546de0bc9e66b89ca6577b3ec97c05b019643e2231b: Status 404 returned error can't find the container with id 
e9c22cc2aad9771153eae546de0bc9e66b89ca6577b3ec97c05b019643e2231b Nov 24 09:13:29 crc kubenswrapper[4563]: I1124 09:13:29.940758 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f5d97b9d-d474f"] Nov 24 09:13:29 crc kubenswrapper[4563]: W1124 09:13:29.944013 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc840d193_9418_46bf_8271_d03d57fb14c0.slice/crio-1480ee7b4df4f89dca0cadbe4de891ce6eb5d864726950699b883c6fe4141825 WatchSource:0}: Error finding container 1480ee7b4df4f89dca0cadbe4de891ce6eb5d864726950699b883c6fe4141825: Status 404 returned error can't find the container with id 1480ee7b4df4f89dca0cadbe4de891ce6eb5d864726950699b883c6fe4141825 Nov 24 09:13:30 crc kubenswrapper[4563]: I1124 09:13:30.102672 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-jbdfx" event={"ID":"71fe428e-199c-422c-8911-79d2a7d27ab1","Type":"ContainerStarted","Data":"d06f9044fee949f4d198802ce755af85941ec3bb1d5cec2cfe72c0636cb69e08"} Nov 24 09:13:30 crc kubenswrapper[4563]: I1124 09:13:30.103489 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-xf2lb" event={"ID":"d0a1ac8a-df66-4ac5-9aed-a2001c905f21","Type":"ContainerStarted","Data":"5ff538f461c03c1ac14f9eb1e7abe4f9db8e40f60ea7c9d6ce80e8f3e43d74d7"} Nov 24 09:13:30 crc kubenswrapper[4563]: I1124 09:13:30.104707 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f5d97b9d-d474f" event={"ID":"c840d193-9418-46bf-8271-d03d57fb14c0","Type":"ContainerStarted","Data":"638990d2f66abcf282327aef3ca7480f407ede116a0a21cb49a8cd54d42e36ec"} Nov 24 09:13:30 crc kubenswrapper[4563]: I1124 09:13:30.104757 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f5d97b9d-d474f" 
event={"ID":"c840d193-9418-46bf-8271-d03d57fb14c0","Type":"ContainerStarted","Data":"1480ee7b4df4f89dca0cadbe4de891ce6eb5d864726950699b883c6fe4141825"} Nov 24 09:13:30 crc kubenswrapper[4563]: I1124 09:13:30.105530 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tqw7b" event={"ID":"89298ce0-9e0a-4351-96a9-4b69233c7ba0","Type":"ContainerStarted","Data":"e9c22cc2aad9771153eae546de0bc9e66b89ca6577b3ec97c05b019643e2231b"} Nov 24 09:13:30 crc kubenswrapper[4563]: I1124 09:13:30.106213 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-qz8gz" event={"ID":"564d8757-3a04-48f3-b3a2-109930f83a10","Type":"ContainerStarted","Data":"6c7fbd7512976bf5ce2e0d7846454a533d953b4d0fba00b27f57afc1813019ae"} Nov 24 09:13:30 crc kubenswrapper[4563]: I1124 09:13:30.116510 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7f5d97b9d-d474f" podStartSLOduration=1.116497967 podStartE2EDuration="1.116497967s" podCreationTimestamp="2025-11-24 09:13:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:13:30.115711855 +0000 UTC m=+587.374689301" watchObservedRunningTime="2025-11-24 09:13:30.116497967 +0000 UTC m=+587.375475414" Nov 24 09:13:32 crc kubenswrapper[4563]: I1124 09:13:32.118437 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-jbdfx" event={"ID":"71fe428e-199c-422c-8911-79d2a7d27ab1","Type":"ContainerStarted","Data":"154809d7d938257bec4d8f3b065d474de11873d70716bc0bc8c35f102313d2ea"} Nov 24 09:13:32 crc kubenswrapper[4563]: I1124 09:13:32.118999 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-jbdfx" Nov 24 09:13:32 crc kubenswrapper[4563]: I1124 09:13:32.120118 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-xf2lb" event={"ID":"d0a1ac8a-df66-4ac5-9aed-a2001c905f21","Type":"ContainerStarted","Data":"11c39a8cf83cc6196d395eeab0636e094c8e2806196680816ce5be9c9847acec"} Nov 24 09:13:32 crc kubenswrapper[4563]: I1124 09:13:32.121691 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-qz8gz" event={"ID":"564d8757-3a04-48f3-b3a2-109930f83a10","Type":"ContainerStarted","Data":"c33f07d8c67f117900277a84cacb32c5d9e8e8480e9dc60704f59ef01998ce7e"} Nov 24 09:13:32 crc kubenswrapper[4563]: I1124 09:13:32.121825 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-qz8gz" Nov 24 09:13:32 crc kubenswrapper[4563]: I1124 09:13:32.131890 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-jbdfx" podStartSLOduration=0.915187577 podStartE2EDuration="3.131876597s" podCreationTimestamp="2025-11-24 09:13:29 +0000 UTC" firstStartedPulling="2025-11-24 09:13:29.36635638 +0000 UTC m=+586.625333818" lastFinishedPulling="2025-11-24 09:13:31.583045392 +0000 UTC m=+588.842022838" observedRunningTime="2025-11-24 09:13:32.129369379 +0000 UTC m=+589.388346826" watchObservedRunningTime="2025-11-24 09:13:32.131876597 +0000 UTC m=+589.390854044" Nov 24 09:13:32 crc kubenswrapper[4563]: I1124 09:13:32.143608 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-qz8gz" podStartSLOduration=2.246931169 podStartE2EDuration="4.143597631s" podCreationTimestamp="2025-11-24 09:13:28 +0000 UTC" firstStartedPulling="2025-11-24 09:13:29.678211085 +0000 UTC m=+586.937188533" lastFinishedPulling="2025-11-24 09:13:31.574877549 +0000 UTC m=+588.833854995" observedRunningTime="2025-11-24 09:13:32.139163799 +0000 UTC m=+589.398141246" watchObservedRunningTime="2025-11-24 09:13:32.143597631 +0000 UTC m=+589.402575078" Nov 24 09:13:33 
crc kubenswrapper[4563]: I1124 09:13:33.131105 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tqw7b" event={"ID":"89298ce0-9e0a-4351-96a9-4b69233c7ba0","Type":"ContainerStarted","Data":"da33dee62a7ffd7ca7084fce006b0226138710d863eb0cc98a14ed03e42ff755"} Nov 24 09:13:33 crc kubenswrapper[4563]: I1124 09:13:33.143460 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-tqw7b" podStartSLOduration=1.6292285130000002 podStartE2EDuration="4.143422149s" podCreationTimestamp="2025-11-24 09:13:29 +0000 UTC" firstStartedPulling="2025-11-24 09:13:29.778534966 +0000 UTC m=+587.037512413" lastFinishedPulling="2025-11-24 09:13:32.292728602 +0000 UTC m=+589.551706049" observedRunningTime="2025-11-24 09:13:33.142487586 +0000 UTC m=+590.401465033" watchObservedRunningTime="2025-11-24 09:13:33.143422149 +0000 UTC m=+590.402399595" Nov 24 09:13:34 crc kubenswrapper[4563]: I1124 09:13:34.136337 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-xf2lb" event={"ID":"d0a1ac8a-df66-4ac5-9aed-a2001c905f21","Type":"ContainerStarted","Data":"b8b4b5b10e416ec79e1d85f30d3696529e272fbfb899063501432fd74520c360"} Nov 24 09:13:34 crc kubenswrapper[4563]: I1124 09:13:34.151750 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-xf2lb" podStartSLOduration=2.236696337 podStartE2EDuration="6.151735876s" podCreationTimestamp="2025-11-24 09:13:28 +0000 UTC" firstStartedPulling="2025-11-24 09:13:29.511919103 +0000 UTC m=+586.770896549" lastFinishedPulling="2025-11-24 09:13:33.426958642 +0000 UTC m=+590.685936088" observedRunningTime="2025-11-24 09:13:34.147239357 +0000 UTC m=+591.406216804" watchObservedRunningTime="2025-11-24 09:13:34.151735876 +0000 UTC m=+591.410713324" Nov 24 09:13:38 crc kubenswrapper[4563]: I1124 09:13:38.987424 4563 
patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:13:38 crc kubenswrapper[4563]: I1124 09:13:38.987731 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:13:39 crc kubenswrapper[4563]: I1124 09:13:39.367823 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-jbdfx" Nov 24 09:13:39 crc kubenswrapper[4563]: I1124 09:13:39.605134 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7f5d97b9d-d474f" Nov 24 09:13:39 crc kubenswrapper[4563]: I1124 09:13:39.605184 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7f5d97b9d-d474f" Nov 24 09:13:39 crc kubenswrapper[4563]: I1124 09:13:39.608627 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7f5d97b9d-d474f" Nov 24 09:13:40 crc kubenswrapper[4563]: I1124 09:13:40.162525 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7f5d97b9d-d474f" Nov 24 09:13:40 crc kubenswrapper[4563]: I1124 09:13:40.192686 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-7hx7w"] Nov 24 09:13:49 crc kubenswrapper[4563]: I1124 09:13:49.328033 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-qz8gz" Nov 24 09:13:58 crc 
kubenswrapper[4563]: I1124 09:13:58.801009 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc"] Nov 24 09:13:58 crc kubenswrapper[4563]: I1124 09:13:58.802207 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc" Nov 24 09:13:58 crc kubenswrapper[4563]: I1124 09:13:58.803780 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 24 09:13:58 crc kubenswrapper[4563]: I1124 09:13:58.808408 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc"] Nov 24 09:13:58 crc kubenswrapper[4563]: I1124 09:13:58.975551 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e79caaa-09f9-4720-b80d-300d880d7e26-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc\" (UID: \"2e79caaa-09f9-4720-b80d-300d880d7e26\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc" Nov 24 09:13:58 crc kubenswrapper[4563]: I1124 09:13:58.975665 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smn54\" (UniqueName: \"kubernetes.io/projected/2e79caaa-09f9-4720-b80d-300d880d7e26-kube-api-access-smn54\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc\" (UID: \"2e79caaa-09f9-4720-b80d-300d880d7e26\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc" Nov 24 09:13:58 crc kubenswrapper[4563]: I1124 09:13:58.975713 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/2e79caaa-09f9-4720-b80d-300d880d7e26-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc\" (UID: \"2e79caaa-09f9-4720-b80d-300d880d7e26\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc" Nov 24 09:13:59 crc kubenswrapper[4563]: I1124 09:13:59.076316 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e79caaa-09f9-4720-b80d-300d880d7e26-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc\" (UID: \"2e79caaa-09f9-4720-b80d-300d880d7e26\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc" Nov 24 09:13:59 crc kubenswrapper[4563]: I1124 09:13:59.076753 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e79caaa-09f9-4720-b80d-300d880d7e26-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc\" (UID: \"2e79caaa-09f9-4720-b80d-300d880d7e26\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc" Nov 24 09:13:59 crc kubenswrapper[4563]: I1124 09:13:59.076759 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smn54\" (UniqueName: \"kubernetes.io/projected/2e79caaa-09f9-4720-b80d-300d880d7e26-kube-api-access-smn54\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc\" (UID: \"2e79caaa-09f9-4720-b80d-300d880d7e26\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc" Nov 24 09:13:59 crc kubenswrapper[4563]: I1124 09:13:59.076846 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e79caaa-09f9-4720-b80d-300d880d7e26-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc\" (UID: 
\"2e79caaa-09f9-4720-b80d-300d880d7e26\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc" Nov 24 09:13:59 crc kubenswrapper[4563]: I1124 09:13:59.077259 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e79caaa-09f9-4720-b80d-300d880d7e26-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc\" (UID: \"2e79caaa-09f9-4720-b80d-300d880d7e26\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc" Nov 24 09:13:59 crc kubenswrapper[4563]: I1124 09:13:59.090752 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smn54\" (UniqueName: \"kubernetes.io/projected/2e79caaa-09f9-4720-b80d-300d880d7e26-kube-api-access-smn54\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc\" (UID: \"2e79caaa-09f9-4720-b80d-300d880d7e26\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc" Nov 24 09:13:59 crc kubenswrapper[4563]: I1124 09:13:59.113168 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc" Nov 24 09:13:59 crc kubenswrapper[4563]: I1124 09:13:59.445926 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc"] Nov 24 09:14:00 crc kubenswrapper[4563]: I1124 09:14:00.242535 4563 generic.go:334] "Generic (PLEG): container finished" podID="2e79caaa-09f9-4720-b80d-300d880d7e26" containerID="71c530c367ae64e5c5f5537f32537d0f0620a9240212945e3fc93021121df6b7" exitCode=0 Nov 24 09:14:00 crc kubenswrapper[4563]: I1124 09:14:00.242573 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc" event={"ID":"2e79caaa-09f9-4720-b80d-300d880d7e26","Type":"ContainerDied","Data":"71c530c367ae64e5c5f5537f32537d0f0620a9240212945e3fc93021121df6b7"} Nov 24 09:14:00 crc kubenswrapper[4563]: I1124 09:14:00.242606 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc" event={"ID":"2e79caaa-09f9-4720-b80d-300d880d7e26","Type":"ContainerStarted","Data":"625210d2369efce46a5fe534e73e36b269f9cdb1453e80f39a281afbf8a57207"} Nov 24 09:14:02 crc kubenswrapper[4563]: I1124 09:14:02.252284 4563 generic.go:334] "Generic (PLEG): container finished" podID="2e79caaa-09f9-4720-b80d-300d880d7e26" containerID="1243ebce648e4c55621d3bef8f28aebb2da748724ca78611684b88982450db4c" exitCode=0 Nov 24 09:14:02 crc kubenswrapper[4563]: I1124 09:14:02.252334 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc" event={"ID":"2e79caaa-09f9-4720-b80d-300d880d7e26","Type":"ContainerDied","Data":"1243ebce648e4c55621d3bef8f28aebb2da748724ca78611684b88982450db4c"} Nov 24 09:14:03 crc kubenswrapper[4563]: I1124 09:14:03.256445 4563 
generic.go:334] "Generic (PLEG): container finished" podID="2e79caaa-09f9-4720-b80d-300d880d7e26" containerID="900a1b96a351f457ae52bce706a3e2e91c16f5c558087c0203c5f24c00cda4b6" exitCode=0 Nov 24 09:14:03 crc kubenswrapper[4563]: I1124 09:14:03.256485 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc" event={"ID":"2e79caaa-09f9-4720-b80d-300d880d7e26","Type":"ContainerDied","Data":"900a1b96a351f457ae52bce706a3e2e91c16f5c558087c0203c5f24c00cda4b6"} Nov 24 09:14:04 crc kubenswrapper[4563]: I1124 09:14:04.415145 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc" Nov 24 09:14:04 crc kubenswrapper[4563]: I1124 09:14:04.536858 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smn54\" (UniqueName: \"kubernetes.io/projected/2e79caaa-09f9-4720-b80d-300d880d7e26-kube-api-access-smn54\") pod \"2e79caaa-09f9-4720-b80d-300d880d7e26\" (UID: \"2e79caaa-09f9-4720-b80d-300d880d7e26\") " Nov 24 09:14:04 crc kubenswrapper[4563]: I1124 09:14:04.536938 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e79caaa-09f9-4720-b80d-300d880d7e26-bundle\") pod \"2e79caaa-09f9-4720-b80d-300d880d7e26\" (UID: \"2e79caaa-09f9-4720-b80d-300d880d7e26\") " Nov 24 09:14:04 crc kubenswrapper[4563]: I1124 09:14:04.536971 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e79caaa-09f9-4720-b80d-300d880d7e26-util\") pod \"2e79caaa-09f9-4720-b80d-300d880d7e26\" (UID: \"2e79caaa-09f9-4720-b80d-300d880d7e26\") " Nov 24 09:14:04 crc kubenswrapper[4563]: I1124 09:14:04.537715 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2e79caaa-09f9-4720-b80d-300d880d7e26-bundle" (OuterVolumeSpecName: "bundle") pod "2e79caaa-09f9-4720-b80d-300d880d7e26" (UID: "2e79caaa-09f9-4720-b80d-300d880d7e26"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:14:04 crc kubenswrapper[4563]: I1124 09:14:04.541132 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e79caaa-09f9-4720-b80d-300d880d7e26-kube-api-access-smn54" (OuterVolumeSpecName: "kube-api-access-smn54") pod "2e79caaa-09f9-4720-b80d-300d880d7e26" (UID: "2e79caaa-09f9-4720-b80d-300d880d7e26"). InnerVolumeSpecName "kube-api-access-smn54". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:14:04 crc kubenswrapper[4563]: I1124 09:14:04.547363 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e79caaa-09f9-4720-b80d-300d880d7e26-util" (OuterVolumeSpecName: "util") pod "2e79caaa-09f9-4720-b80d-300d880d7e26" (UID: "2e79caaa-09f9-4720-b80d-300d880d7e26"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:14:04 crc kubenswrapper[4563]: I1124 09:14:04.638084 4563 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e79caaa-09f9-4720-b80d-300d880d7e26-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:14:04 crc kubenswrapper[4563]: I1124 09:14:04.638108 4563 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e79caaa-09f9-4720-b80d-300d880d7e26-util\") on node \"crc\" DevicePath \"\"" Nov 24 09:14:04 crc kubenswrapper[4563]: I1124 09:14:04.638116 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smn54\" (UniqueName: \"kubernetes.io/projected/2e79caaa-09f9-4720-b80d-300d880d7e26-kube-api-access-smn54\") on node \"crc\" DevicePath \"\"" Nov 24 09:14:05 crc kubenswrapper[4563]: I1124 09:14:05.220240 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-7hx7w" podUID="66a93c10-4a3e-4c5a-a0b0-ace9213e4a28" containerName="console" containerID="cri-o://c8250f0446ef98d65787e3a58b19b297a82346cd3ed13878e276419ecf3ba3da" gracePeriod=15 Nov 24 09:14:05 crc kubenswrapper[4563]: I1124 09:14:05.264073 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc" event={"ID":"2e79caaa-09f9-4720-b80d-300d880d7e26","Type":"ContainerDied","Data":"625210d2369efce46a5fe534e73e36b269f9cdb1453e80f39a281afbf8a57207"} Nov 24 09:14:05 crc kubenswrapper[4563]: I1124 09:14:05.264109 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="625210d2369efce46a5fe534e73e36b269f9cdb1453e80f39a281afbf8a57207" Nov 24 09:14:05 crc kubenswrapper[4563]: I1124 09:14:05.264118 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc" Nov 24 09:14:05 crc kubenswrapper[4563]: I1124 09:14:05.487624 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-7hx7w_66a93c10-4a3e-4c5a-a0b0-ace9213e4a28/console/0.log" Nov 24 09:14:05 crc kubenswrapper[4563]: I1124 09:14:05.487709 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-7hx7w" Nov 24 09:14:05 crc kubenswrapper[4563]: I1124 09:14:05.647288 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-console-oauth-config\") pod \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\" (UID: \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\") " Nov 24 09:14:05 crc kubenswrapper[4563]: I1124 09:14:05.647554 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-console-serving-cert\") pod \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\" (UID: \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\") " Nov 24 09:14:05 crc kubenswrapper[4563]: I1124 09:14:05.647577 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hrrj\" (UniqueName: \"kubernetes.io/projected/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-kube-api-access-5hrrj\") pod \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\" (UID: \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\") " Nov 24 09:14:05 crc kubenswrapper[4563]: I1124 09:14:05.647600 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-trusted-ca-bundle\") pod \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\" (UID: \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\") " Nov 24 
09:14:05 crc kubenswrapper[4563]: I1124 09:14:05.647657 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-oauth-serving-cert\") pod \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\" (UID: \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\") " Nov 24 09:14:05 crc kubenswrapper[4563]: I1124 09:14:05.647687 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-console-config\") pod \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\" (UID: \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\") " Nov 24 09:14:05 crc kubenswrapper[4563]: I1124 09:14:05.647705 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-service-ca\") pod \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\" (UID: \"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28\") " Nov 24 09:14:05 crc kubenswrapper[4563]: I1124 09:14:05.648291 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-service-ca" (OuterVolumeSpecName: "service-ca") pod "66a93c10-4a3e-4c5a-a0b0-ace9213e4a28" (UID: "66a93c10-4a3e-4c5a-a0b0-ace9213e4a28"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:14:05 crc kubenswrapper[4563]: I1124 09:14:05.648318 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "66a93c10-4a3e-4c5a-a0b0-ace9213e4a28" (UID: "66a93c10-4a3e-4c5a-a0b0-ace9213e4a28"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:14:05 crc kubenswrapper[4563]: I1124 09:14:05.648296 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-console-config" (OuterVolumeSpecName: "console-config") pod "66a93c10-4a3e-4c5a-a0b0-ace9213e4a28" (UID: "66a93c10-4a3e-4c5a-a0b0-ace9213e4a28"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:14:05 crc kubenswrapper[4563]: I1124 09:14:05.648674 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "66a93c10-4a3e-4c5a-a0b0-ace9213e4a28" (UID: "66a93c10-4a3e-4c5a-a0b0-ace9213e4a28"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:14:05 crc kubenswrapper[4563]: I1124 09:14:05.651710 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-kube-api-access-5hrrj" (OuterVolumeSpecName: "kube-api-access-5hrrj") pod "66a93c10-4a3e-4c5a-a0b0-ace9213e4a28" (UID: "66a93c10-4a3e-4c5a-a0b0-ace9213e4a28"). InnerVolumeSpecName "kube-api-access-5hrrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:14:05 crc kubenswrapper[4563]: I1124 09:14:05.651928 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "66a93c10-4a3e-4c5a-a0b0-ace9213e4a28" (UID: "66a93c10-4a3e-4c5a-a0b0-ace9213e4a28"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:14:05 crc kubenswrapper[4563]: I1124 09:14:05.652281 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "66a93c10-4a3e-4c5a-a0b0-ace9213e4a28" (UID: "66a93c10-4a3e-4c5a-a0b0-ace9213e4a28"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:14:05 crc kubenswrapper[4563]: I1124 09:14:05.748995 4563 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:14:05 crc kubenswrapper[4563]: I1124 09:14:05.749026 4563 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-console-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:14:05 crc kubenswrapper[4563]: I1124 09:14:05.749036 4563 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-service-ca\") on node \"crc\" DevicePath \"\"" Nov 24 09:14:05 crc kubenswrapper[4563]: I1124 09:14:05.749044 4563 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:14:05 crc kubenswrapper[4563]: I1124 09:14:05.749065 4563 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:14:05 crc kubenswrapper[4563]: I1124 09:14:05.749075 4563 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-5hrrj\" (UniqueName: \"kubernetes.io/projected/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-kube-api-access-5hrrj\") on node \"crc\" DevicePath \"\"" Nov 24 09:14:05 crc kubenswrapper[4563]: I1124 09:14:05.749084 4563 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:14:06 crc kubenswrapper[4563]: I1124 09:14:06.268825 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-7hx7w_66a93c10-4a3e-4c5a-a0b0-ace9213e4a28/console/0.log" Nov 24 09:14:06 crc kubenswrapper[4563]: I1124 09:14:06.269005 4563 generic.go:334] "Generic (PLEG): container finished" podID="66a93c10-4a3e-4c5a-a0b0-ace9213e4a28" containerID="c8250f0446ef98d65787e3a58b19b297a82346cd3ed13878e276419ecf3ba3da" exitCode=2 Nov 24 09:14:06 crc kubenswrapper[4563]: I1124 09:14:06.269049 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7hx7w" event={"ID":"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28","Type":"ContainerDied","Data":"c8250f0446ef98d65787e3a58b19b297a82346cd3ed13878e276419ecf3ba3da"} Nov 24 09:14:06 crc kubenswrapper[4563]: I1124 09:14:06.269078 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-7hx7w" Nov 24 09:14:06 crc kubenswrapper[4563]: I1124 09:14:06.269248 4563 scope.go:117] "RemoveContainer" containerID="c8250f0446ef98d65787e3a58b19b297a82346cd3ed13878e276419ecf3ba3da" Nov 24 09:14:06 crc kubenswrapper[4563]: I1124 09:14:06.269188 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7hx7w" event={"ID":"66a93c10-4a3e-4c5a-a0b0-ace9213e4a28","Type":"ContainerDied","Data":"92799b1e34eeeee51c6e4aaaa6b0b28eae0a13ca48fff82f24d4c1dd82f733f7"} Nov 24 09:14:06 crc kubenswrapper[4563]: I1124 09:14:06.280365 4563 scope.go:117] "RemoveContainer" containerID="c8250f0446ef98d65787e3a58b19b297a82346cd3ed13878e276419ecf3ba3da" Nov 24 09:14:06 crc kubenswrapper[4563]: E1124 09:14:06.280585 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8250f0446ef98d65787e3a58b19b297a82346cd3ed13878e276419ecf3ba3da\": container with ID starting with c8250f0446ef98d65787e3a58b19b297a82346cd3ed13878e276419ecf3ba3da not found: ID does not exist" containerID="c8250f0446ef98d65787e3a58b19b297a82346cd3ed13878e276419ecf3ba3da" Nov 24 09:14:06 crc kubenswrapper[4563]: I1124 09:14:06.280612 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8250f0446ef98d65787e3a58b19b297a82346cd3ed13878e276419ecf3ba3da"} err="failed to get container status \"c8250f0446ef98d65787e3a58b19b297a82346cd3ed13878e276419ecf3ba3da\": rpc error: code = NotFound desc = could not find container \"c8250f0446ef98d65787e3a58b19b297a82346cd3ed13878e276419ecf3ba3da\": container with ID starting with c8250f0446ef98d65787e3a58b19b297a82346cd3ed13878e276419ecf3ba3da not found: ID does not exist" Nov 24 09:14:06 crc kubenswrapper[4563]: I1124 09:14:06.286782 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-7hx7w"] Nov 24 09:14:06 crc 
kubenswrapper[4563]: I1124 09:14:06.288938 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-7hx7w"] Nov 24 09:14:07 crc kubenswrapper[4563]: I1124 09:14:07.059324 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66a93c10-4a3e-4c5a-a0b0-ace9213e4a28" path="/var/lib/kubelet/pods/66a93c10-4a3e-4c5a-a0b0-ace9213e4a28/volumes" Nov 24 09:14:08 crc kubenswrapper[4563]: I1124 09:14:08.987541 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:14:08 crc kubenswrapper[4563]: I1124 09:14:08.988336 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:14:08 crc kubenswrapper[4563]: I1124 09:14:08.988450 4563 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" Nov 24 09:14:08 crc kubenswrapper[4563]: I1124 09:14:08.989045 4563 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"70f9c1996df47923e8f844e33d7bfcc71ba8679e22cd73a43ba1a4424e2bc93b"} pod="openshift-machine-config-operator/machine-config-daemon-stlxr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 09:14:08 crc kubenswrapper[4563]: I1124 09:14:08.989190 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" 
podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" containerID="cri-o://70f9c1996df47923e8f844e33d7bfcc71ba8679e22cd73a43ba1a4424e2bc93b" gracePeriod=600 Nov 24 09:14:09 crc kubenswrapper[4563]: I1124 09:14:09.285866 4563 generic.go:334] "Generic (PLEG): container finished" podID="3b2bfe55-8989-49b3-bb61-e28189447627" containerID="70f9c1996df47923e8f844e33d7bfcc71ba8679e22cd73a43ba1a4424e2bc93b" exitCode=0 Nov 24 09:14:09 crc kubenswrapper[4563]: I1124 09:14:09.285958 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" event={"ID":"3b2bfe55-8989-49b3-bb61-e28189447627","Type":"ContainerDied","Data":"70f9c1996df47923e8f844e33d7bfcc71ba8679e22cd73a43ba1a4424e2bc93b"} Nov 24 09:14:09 crc kubenswrapper[4563]: I1124 09:14:09.286091 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" event={"ID":"3b2bfe55-8989-49b3-bb61-e28189447627","Type":"ContainerStarted","Data":"6964b83d094213dce29e8bc08bcdab313730109f615c70e3b48ab3147ba318f2"} Nov 24 09:14:09 crc kubenswrapper[4563]: I1124 09:14:09.286111 4563 scope.go:117] "RemoveContainer" containerID="693aaa2fd38048eca425d5cf8bf8e834a76fe87db5bd736efd4bf0270f272397" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.027176 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7cd8c86c8-w62w9"] Nov 24 09:14:14 crc kubenswrapper[4563]: E1124 09:14:14.027727 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e79caaa-09f9-4720-b80d-300d880d7e26" containerName="extract" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.027741 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e79caaa-09f9-4720-b80d-300d880d7e26" containerName="extract" Nov 24 09:14:14 crc kubenswrapper[4563]: E1124 09:14:14.027756 4563 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2e79caaa-09f9-4720-b80d-300d880d7e26" containerName="pull" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.027761 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e79caaa-09f9-4720-b80d-300d880d7e26" containerName="pull" Nov 24 09:14:14 crc kubenswrapper[4563]: E1124 09:14:14.027771 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e79caaa-09f9-4720-b80d-300d880d7e26" containerName="util" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.027777 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e79caaa-09f9-4720-b80d-300d880d7e26" containerName="util" Nov 24 09:14:14 crc kubenswrapper[4563]: E1124 09:14:14.027786 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66a93c10-4a3e-4c5a-a0b0-ace9213e4a28" containerName="console" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.027792 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="66a93c10-4a3e-4c5a-a0b0-ace9213e4a28" containerName="console" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.027883 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="66a93c10-4a3e-4c5a-a0b0-ace9213e4a28" containerName="console" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.027892 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e79caaa-09f9-4720-b80d-300d880d7e26" containerName="extract" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.028256 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7cd8c86c8-w62w9" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.031212 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.031228 4563 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.031382 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.031393 4563 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.031467 4563 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-dlk2l" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.034822 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5784d9be-5a59-4204-829a-dc637bfb7d90-webhook-cert\") pod \"metallb-operator-controller-manager-7cd8c86c8-w62w9\" (UID: \"5784d9be-5a59-4204-829a-dc637bfb7d90\") " pod="metallb-system/metallb-operator-controller-manager-7cd8c86c8-w62w9" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.035100 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5784d9be-5a59-4204-829a-dc637bfb7d90-apiservice-cert\") pod \"metallb-operator-controller-manager-7cd8c86c8-w62w9\" (UID: \"5784d9be-5a59-4204-829a-dc637bfb7d90\") " pod="metallb-system/metallb-operator-controller-manager-7cd8c86c8-w62w9" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.035173 4563 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljw9d\" (UniqueName: \"kubernetes.io/projected/5784d9be-5a59-4204-829a-dc637bfb7d90-kube-api-access-ljw9d\") pod \"metallb-operator-controller-manager-7cd8c86c8-w62w9\" (UID: \"5784d9be-5a59-4204-829a-dc637bfb7d90\") " pod="metallb-system/metallb-operator-controller-manager-7cd8c86c8-w62w9" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.043961 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7cd8c86c8-w62w9"] Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.135940 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljw9d\" (UniqueName: \"kubernetes.io/projected/5784d9be-5a59-4204-829a-dc637bfb7d90-kube-api-access-ljw9d\") pod \"metallb-operator-controller-manager-7cd8c86c8-w62w9\" (UID: \"5784d9be-5a59-4204-829a-dc637bfb7d90\") " pod="metallb-system/metallb-operator-controller-manager-7cd8c86c8-w62w9" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.135986 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5784d9be-5a59-4204-829a-dc637bfb7d90-webhook-cert\") pod \"metallb-operator-controller-manager-7cd8c86c8-w62w9\" (UID: \"5784d9be-5a59-4204-829a-dc637bfb7d90\") " pod="metallb-system/metallb-operator-controller-manager-7cd8c86c8-w62w9" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.136039 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5784d9be-5a59-4204-829a-dc637bfb7d90-apiservice-cert\") pod \"metallb-operator-controller-manager-7cd8c86c8-w62w9\" (UID: \"5784d9be-5a59-4204-829a-dc637bfb7d90\") " pod="metallb-system/metallb-operator-controller-manager-7cd8c86c8-w62w9" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.141609 4563 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5784d9be-5a59-4204-829a-dc637bfb7d90-webhook-cert\") pod \"metallb-operator-controller-manager-7cd8c86c8-w62w9\" (UID: \"5784d9be-5a59-4204-829a-dc637bfb7d90\") " pod="metallb-system/metallb-operator-controller-manager-7cd8c86c8-w62w9" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.141607 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5784d9be-5a59-4204-829a-dc637bfb7d90-apiservice-cert\") pod \"metallb-operator-controller-manager-7cd8c86c8-w62w9\" (UID: \"5784d9be-5a59-4204-829a-dc637bfb7d90\") " pod="metallb-system/metallb-operator-controller-manager-7cd8c86c8-w62w9" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.154358 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljw9d\" (UniqueName: \"kubernetes.io/projected/5784d9be-5a59-4204-829a-dc637bfb7d90-kube-api-access-ljw9d\") pod \"metallb-operator-controller-manager-7cd8c86c8-w62w9\" (UID: \"5784d9be-5a59-4204-829a-dc637bfb7d90\") " pod="metallb-system/metallb-operator-controller-manager-7cd8c86c8-w62w9" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.165612 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-559746f898-fwz9n"] Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.166178 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-559746f898-fwz9n" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.168689 4563 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.168807 4563 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-lc4vh" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.174492 4563 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.178931 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-559746f898-fwz9n"] Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.237311 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/29db6437-f6b7-4f7f-a855-33b7316b09f8-apiservice-cert\") pod \"metallb-operator-webhook-server-559746f898-fwz9n\" (UID: \"29db6437-f6b7-4f7f-a855-33b7316b09f8\") " pod="metallb-system/metallb-operator-webhook-server-559746f898-fwz9n" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.237492 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/29db6437-f6b7-4f7f-a855-33b7316b09f8-webhook-cert\") pod \"metallb-operator-webhook-server-559746f898-fwz9n\" (UID: \"29db6437-f6b7-4f7f-a855-33b7316b09f8\") " pod="metallb-system/metallb-operator-webhook-server-559746f898-fwz9n" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.237564 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j66t9\" (UniqueName: 
\"kubernetes.io/projected/29db6437-f6b7-4f7f-a855-33b7316b09f8-kube-api-access-j66t9\") pod \"metallb-operator-webhook-server-559746f898-fwz9n\" (UID: \"29db6437-f6b7-4f7f-a855-33b7316b09f8\") " pod="metallb-system/metallb-operator-webhook-server-559746f898-fwz9n" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.338651 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/29db6437-f6b7-4f7f-a855-33b7316b09f8-webhook-cert\") pod \"metallb-operator-webhook-server-559746f898-fwz9n\" (UID: \"29db6437-f6b7-4f7f-a855-33b7316b09f8\") " pod="metallb-system/metallb-operator-webhook-server-559746f898-fwz9n" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.338917 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j66t9\" (UniqueName: \"kubernetes.io/projected/29db6437-f6b7-4f7f-a855-33b7316b09f8-kube-api-access-j66t9\") pod \"metallb-operator-webhook-server-559746f898-fwz9n\" (UID: \"29db6437-f6b7-4f7f-a855-33b7316b09f8\") " pod="metallb-system/metallb-operator-webhook-server-559746f898-fwz9n" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.339005 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/29db6437-f6b7-4f7f-a855-33b7316b09f8-apiservice-cert\") pod \"metallb-operator-webhook-server-559746f898-fwz9n\" (UID: \"29db6437-f6b7-4f7f-a855-33b7316b09f8\") " pod="metallb-system/metallb-operator-webhook-server-559746f898-fwz9n" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.339779 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7cd8c86c8-w62w9" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.341847 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/29db6437-f6b7-4f7f-a855-33b7316b09f8-webhook-cert\") pod \"metallb-operator-webhook-server-559746f898-fwz9n\" (UID: \"29db6437-f6b7-4f7f-a855-33b7316b09f8\") " pod="metallb-system/metallb-operator-webhook-server-559746f898-fwz9n" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.342192 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/29db6437-f6b7-4f7f-a855-33b7316b09f8-apiservice-cert\") pod \"metallb-operator-webhook-server-559746f898-fwz9n\" (UID: \"29db6437-f6b7-4f7f-a855-33b7316b09f8\") " pod="metallb-system/metallb-operator-webhook-server-559746f898-fwz9n" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.357602 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j66t9\" (UniqueName: \"kubernetes.io/projected/29db6437-f6b7-4f7f-a855-33b7316b09f8-kube-api-access-j66t9\") pod \"metallb-operator-webhook-server-559746f898-fwz9n\" (UID: \"29db6437-f6b7-4f7f-a855-33b7316b09f8\") " pod="metallb-system/metallb-operator-webhook-server-559746f898-fwz9n" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.506612 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-559746f898-fwz9n" Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.711941 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7cd8c86c8-w62w9"] Nov 24 09:14:14 crc kubenswrapper[4563]: I1124 09:14:14.726005 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-559746f898-fwz9n"] Nov 24 09:14:14 crc kubenswrapper[4563]: W1124 09:14:14.732045 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29db6437_f6b7_4f7f_a855_33b7316b09f8.slice/crio-375c2d412066a38c9b891c8649e26add22b1e442d05d9f94540f791ebd84785a WatchSource:0}: Error finding container 375c2d412066a38c9b891c8649e26add22b1e442d05d9f94540f791ebd84785a: Status 404 returned error can't find the container with id 375c2d412066a38c9b891c8649e26add22b1e442d05d9f94540f791ebd84785a Nov 24 09:14:15 crc kubenswrapper[4563]: I1124 09:14:15.312783 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-559746f898-fwz9n" event={"ID":"29db6437-f6b7-4f7f-a855-33b7316b09f8","Type":"ContainerStarted","Data":"375c2d412066a38c9b891c8649e26add22b1e442d05d9f94540f791ebd84785a"} Nov 24 09:14:15 crc kubenswrapper[4563]: I1124 09:14:15.313575 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7cd8c86c8-w62w9" event={"ID":"5784d9be-5a59-4204-829a-dc637bfb7d90","Type":"ContainerStarted","Data":"38fa1674ad03ce720dc7dd8bd0ff47a7ab00fea3667514667a1f57d72ebf2e00"} Nov 24 09:14:19 crc kubenswrapper[4563]: I1124 09:14:19.329384 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-559746f898-fwz9n" 
event={"ID":"29db6437-f6b7-4f7f-a855-33b7316b09f8","Type":"ContainerStarted","Data":"78e41cf73ffe1430bc01a508f124d0c48fe40e6bb4feb9bf0f58d3ed43513c17"} Nov 24 09:14:19 crc kubenswrapper[4563]: I1124 09:14:19.329771 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-559746f898-fwz9n" Nov 24 09:14:19 crc kubenswrapper[4563]: I1124 09:14:19.331365 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7cd8c86c8-w62w9" event={"ID":"5784d9be-5a59-4204-829a-dc637bfb7d90","Type":"ContainerStarted","Data":"bcbffa0255245f50ee788912bdb858ab31b96b0c11cee1ae33e1dacf8ba6abeb"} Nov 24 09:14:19 crc kubenswrapper[4563]: I1124 09:14:19.331503 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7cd8c86c8-w62w9" Nov 24 09:14:19 crc kubenswrapper[4563]: I1124 09:14:19.345417 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-559746f898-fwz9n" podStartSLOduration=1.601013155 podStartE2EDuration="5.345402754s" podCreationTimestamp="2025-11-24 09:14:14 +0000 UTC" firstStartedPulling="2025-11-24 09:14:14.734829187 +0000 UTC m=+631.993806634" lastFinishedPulling="2025-11-24 09:14:18.479218785 +0000 UTC m=+635.738196233" observedRunningTime="2025-11-24 09:14:19.342915714 +0000 UTC m=+636.601893161" watchObservedRunningTime="2025-11-24 09:14:19.345402754 +0000 UTC m=+636.604380201" Nov 24 09:14:19 crc kubenswrapper[4563]: I1124 09:14:19.359165 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7cd8c86c8-w62w9" podStartSLOduration=1.609694448 podStartE2EDuration="5.359149559s" podCreationTimestamp="2025-11-24 09:14:14 +0000 UTC" firstStartedPulling="2025-11-24 09:14:14.71772922 +0000 UTC m=+631.976706667" lastFinishedPulling="2025-11-24 
09:14:18.467184331 +0000 UTC m=+635.726161778" observedRunningTime="2025-11-24 09:14:19.356270379 +0000 UTC m=+636.615247826" watchObservedRunningTime="2025-11-24 09:14:19.359149559 +0000 UTC m=+636.618127006" Nov 24 09:14:34 crc kubenswrapper[4563]: I1124 09:14:34.513107 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-559746f898-fwz9n" Nov 24 09:14:54 crc kubenswrapper[4563]: I1124 09:14:54.342104 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7cd8c86c8-w62w9" Nov 24 09:14:54 crc kubenswrapper[4563]: I1124 09:14:54.917614 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-qwb5s"] Nov 24 09:14:54 crc kubenswrapper[4563]: I1124 09:14:54.919868 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-qwb5s" Nov 24 09:14:54 crc kubenswrapper[4563]: I1124 09:14:54.920455 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-fchts"] Nov 24 09:14:54 crc kubenswrapper[4563]: I1124 09:14:54.921258 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-fchts" Nov 24 09:14:54 crc kubenswrapper[4563]: I1124 09:14:54.921607 4563 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Nov 24 09:14:54 crc kubenswrapper[4563]: I1124 09:14:54.921788 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Nov 24 09:14:54 crc kubenswrapper[4563]: I1124 09:14:54.921840 4563 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-n2dw8" Nov 24 09:14:54 crc kubenswrapper[4563]: I1124 09:14:54.922536 4563 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Nov 24 09:14:54 crc kubenswrapper[4563]: I1124 09:14:54.931784 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-fchts"] Nov 24 09:14:54 crc kubenswrapper[4563]: I1124 09:14:54.974948 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-tddpp"] Nov 24 09:14:54 crc kubenswrapper[4563]: I1124 09:14:54.975859 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-tddpp" Nov 24 09:14:54 crc kubenswrapper[4563]: I1124 09:14:54.986341 4563 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Nov 24 09:14:54 crc kubenswrapper[4563]: I1124 09:14:54.986362 4563 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-5rgwd" Nov 24 09:14:54 crc kubenswrapper[4563]: I1124 09:14:54.986842 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6c7b4b5f48-7rql8"] Nov 24 09:14:54 crc kubenswrapper[4563]: I1124 09:14:54.987699 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-7rql8" Nov 24 09:14:54 crc kubenswrapper[4563]: I1124 09:14:54.988854 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Nov 24 09:14:54 crc kubenswrapper[4563]: I1124 09:14:54.989293 4563 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Nov 24 09:14:54 crc kubenswrapper[4563]: I1124 09:14:54.990630 4563 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.000688 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-7rql8"] Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.095196 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/54e0a364-1f3d-493b-8c11-2d59672a99e1-reloader\") pod \"frr-k8s-qwb5s\" (UID: \"54e0a364-1f3d-493b-8c11-2d59672a99e1\") " pod="metallb-system/frr-k8s-qwb5s" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.095261 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3b0c98e8-df1b-485b-972d-2e2ff8103006-metallb-excludel2\") pod \"speaker-tddpp\" (UID: \"3b0c98e8-df1b-485b-972d-2e2ff8103006\") " pod="metallb-system/speaker-tddpp" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.095405 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/54e0a364-1f3d-493b-8c11-2d59672a99e1-metrics\") pod \"frr-k8s-qwb5s\" (UID: \"54e0a364-1f3d-493b-8c11-2d59672a99e1\") " pod="metallb-system/frr-k8s-qwb5s" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.095491 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3ae4470-f488-4bc7-b9e0-a37903b5400a-metrics-certs\") pod \"controller-6c7b4b5f48-7rql8\" (UID: \"e3ae4470-f488-4bc7-b9e0-a37903b5400a\") " pod="metallb-system/controller-6c7b4b5f48-7rql8" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.095538 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3b0c98e8-df1b-485b-972d-2e2ff8103006-memberlist\") pod \"speaker-tddpp\" (UID: \"3b0c98e8-df1b-485b-972d-2e2ff8103006\") " pod="metallb-system/speaker-tddpp" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.095577 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3ae4470-f488-4bc7-b9e0-a37903b5400a-cert\") pod \"controller-6c7b4b5f48-7rql8\" (UID: \"e3ae4470-f488-4bc7-b9e0-a37903b5400a\") " pod="metallb-system/controller-6c7b4b5f48-7rql8" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.095623 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxsdb\" (UniqueName: \"kubernetes.io/projected/3b0c98e8-df1b-485b-972d-2e2ff8103006-kube-api-access-qxsdb\") pod \"speaker-tddpp\" (UID: \"3b0c98e8-df1b-485b-972d-2e2ff8103006\") " pod="metallb-system/speaker-tddpp" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.095675 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22afba2f-88ba-4b65-8f98-a024f676b896-cert\") pod \"frr-k8s-webhook-server-6998585d5-fchts\" (UID: \"22afba2f-88ba-4b65-8f98-a024f676b896\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-fchts" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.095731 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfhfc\" (UniqueName: \"kubernetes.io/projected/22afba2f-88ba-4b65-8f98-a024f676b896-kube-api-access-kfhfc\") pod \"frr-k8s-webhook-server-6998585d5-fchts\" (UID: \"22afba2f-88ba-4b65-8f98-a024f676b896\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-fchts" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.095752 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3b0c98e8-df1b-485b-972d-2e2ff8103006-metrics-certs\") pod \"speaker-tddpp\" (UID: \"3b0c98e8-df1b-485b-972d-2e2ff8103006\") " pod="metallb-system/speaker-tddpp" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.095782 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54e0a364-1f3d-493b-8c11-2d59672a99e1-metrics-certs\") pod \"frr-k8s-qwb5s\" (UID: \"54e0a364-1f3d-493b-8c11-2d59672a99e1\") " pod="metallb-system/frr-k8s-qwb5s" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.095852 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/54e0a364-1f3d-493b-8c11-2d59672a99e1-frr-sockets\") pod \"frr-k8s-qwb5s\" (UID: \"54e0a364-1f3d-493b-8c11-2d59672a99e1\") " pod="metallb-system/frr-k8s-qwb5s" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.095914 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z2c5\" (UniqueName: \"kubernetes.io/projected/54e0a364-1f3d-493b-8c11-2d59672a99e1-kube-api-access-4z2c5\") pod \"frr-k8s-qwb5s\" (UID: \"54e0a364-1f3d-493b-8c11-2d59672a99e1\") " pod="metallb-system/frr-k8s-qwb5s" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.095942 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/54e0a364-1f3d-493b-8c11-2d59672a99e1-frr-conf\") pod \"frr-k8s-qwb5s\" (UID: \"54e0a364-1f3d-493b-8c11-2d59672a99e1\") " pod="metallb-system/frr-k8s-qwb5s" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.095977 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdd4p\" (UniqueName: \"kubernetes.io/projected/e3ae4470-f488-4bc7-b9e0-a37903b5400a-kube-api-access-fdd4p\") pod \"controller-6c7b4b5f48-7rql8\" (UID: \"e3ae4470-f488-4bc7-b9e0-a37903b5400a\") " pod="metallb-system/controller-6c7b4b5f48-7rql8" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.095997 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/54e0a364-1f3d-493b-8c11-2d59672a99e1-frr-startup\") pod \"frr-k8s-qwb5s\" (UID: \"54e0a364-1f3d-493b-8c11-2d59672a99e1\") " pod="metallb-system/frr-k8s-qwb5s" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.197955 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3b0c98e8-df1b-485b-972d-2e2ff8103006-metallb-excludel2\") pod \"speaker-tddpp\" (UID: \"3b0c98e8-df1b-485b-972d-2e2ff8103006\") " pod="metallb-system/speaker-tddpp" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.198000 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/54e0a364-1f3d-493b-8c11-2d59672a99e1-metrics\") pod \"frr-k8s-qwb5s\" (UID: \"54e0a364-1f3d-493b-8c11-2d59672a99e1\") " pod="metallb-system/frr-k8s-qwb5s" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.198046 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/e3ae4470-f488-4bc7-b9e0-a37903b5400a-metrics-certs\") pod \"controller-6c7b4b5f48-7rql8\" (UID: \"e3ae4470-f488-4bc7-b9e0-a37903b5400a\") " pod="metallb-system/controller-6c7b4b5f48-7rql8" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.198072 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3b0c98e8-df1b-485b-972d-2e2ff8103006-memberlist\") pod \"speaker-tddpp\" (UID: \"3b0c98e8-df1b-485b-972d-2e2ff8103006\") " pod="metallb-system/speaker-tddpp" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.198116 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3ae4470-f488-4bc7-b9e0-a37903b5400a-cert\") pod \"controller-6c7b4b5f48-7rql8\" (UID: \"e3ae4470-f488-4bc7-b9e0-a37903b5400a\") " pod="metallb-system/controller-6c7b4b5f48-7rql8" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.198162 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxsdb\" (UniqueName: \"kubernetes.io/projected/3b0c98e8-df1b-485b-972d-2e2ff8103006-kube-api-access-qxsdb\") pod \"speaker-tddpp\" (UID: \"3b0c98e8-df1b-485b-972d-2e2ff8103006\") " pod="metallb-system/speaker-tddpp" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.198185 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22afba2f-88ba-4b65-8f98-a024f676b896-cert\") pod \"frr-k8s-webhook-server-6998585d5-fchts\" (UID: \"22afba2f-88ba-4b65-8f98-a024f676b896\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-fchts" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.198209 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfhfc\" (UniqueName: \"kubernetes.io/projected/22afba2f-88ba-4b65-8f98-a024f676b896-kube-api-access-kfhfc\") pod 
\"frr-k8s-webhook-server-6998585d5-fchts\" (UID: \"22afba2f-88ba-4b65-8f98-a024f676b896\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-fchts" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.198229 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3b0c98e8-df1b-485b-972d-2e2ff8103006-metrics-certs\") pod \"speaker-tddpp\" (UID: \"3b0c98e8-df1b-485b-972d-2e2ff8103006\") " pod="metallb-system/speaker-tddpp" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.198257 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54e0a364-1f3d-493b-8c11-2d59672a99e1-metrics-certs\") pod \"frr-k8s-qwb5s\" (UID: \"54e0a364-1f3d-493b-8c11-2d59672a99e1\") " pod="metallb-system/frr-k8s-qwb5s" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.198282 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/54e0a364-1f3d-493b-8c11-2d59672a99e1-frr-sockets\") pod \"frr-k8s-qwb5s\" (UID: \"54e0a364-1f3d-493b-8c11-2d59672a99e1\") " pod="metallb-system/frr-k8s-qwb5s" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.198304 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z2c5\" (UniqueName: \"kubernetes.io/projected/54e0a364-1f3d-493b-8c11-2d59672a99e1-kube-api-access-4z2c5\") pod \"frr-k8s-qwb5s\" (UID: \"54e0a364-1f3d-493b-8c11-2d59672a99e1\") " pod="metallb-system/frr-k8s-qwb5s" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.198319 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/54e0a364-1f3d-493b-8c11-2d59672a99e1-frr-conf\") pod \"frr-k8s-qwb5s\" (UID: \"54e0a364-1f3d-493b-8c11-2d59672a99e1\") " pod="metallb-system/frr-k8s-qwb5s" Nov 24 09:14:55 crc kubenswrapper[4563]: 
I1124 09:14:55.198341 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdd4p\" (UniqueName: \"kubernetes.io/projected/e3ae4470-f488-4bc7-b9e0-a37903b5400a-kube-api-access-fdd4p\") pod \"controller-6c7b4b5f48-7rql8\" (UID: \"e3ae4470-f488-4bc7-b9e0-a37903b5400a\") " pod="metallb-system/controller-6c7b4b5f48-7rql8" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.198355 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/54e0a364-1f3d-493b-8c11-2d59672a99e1-frr-startup\") pod \"frr-k8s-qwb5s\" (UID: \"54e0a364-1f3d-493b-8c11-2d59672a99e1\") " pod="metallb-system/frr-k8s-qwb5s" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.198371 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/54e0a364-1f3d-493b-8c11-2d59672a99e1-reloader\") pod \"frr-k8s-qwb5s\" (UID: \"54e0a364-1f3d-493b-8c11-2d59672a99e1\") " pod="metallb-system/frr-k8s-qwb5s" Nov 24 09:14:55 crc kubenswrapper[4563]: E1124 09:14:55.198456 4563 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 24 09:14:55 crc kubenswrapper[4563]: E1124 09:14:55.198568 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b0c98e8-df1b-485b-972d-2e2ff8103006-memberlist podName:3b0c98e8-df1b-485b-972d-2e2ff8103006 nodeName:}" failed. No retries permitted until 2025-11-24 09:14:55.698536907 +0000 UTC m=+672.957514364 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3b0c98e8-df1b-485b-972d-2e2ff8103006-memberlist") pod "speaker-tddpp" (UID: "3b0c98e8-df1b-485b-972d-2e2ff8103006") : secret "metallb-memberlist" not found Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.198808 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3b0c98e8-df1b-485b-972d-2e2ff8103006-metallb-excludel2\") pod \"speaker-tddpp\" (UID: \"3b0c98e8-df1b-485b-972d-2e2ff8103006\") " pod="metallb-system/speaker-tddpp" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.199395 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/54e0a364-1f3d-493b-8c11-2d59672a99e1-metrics\") pod \"frr-k8s-qwb5s\" (UID: \"54e0a364-1f3d-493b-8c11-2d59672a99e1\") " pod="metallb-system/frr-k8s-qwb5s" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.199613 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/54e0a364-1f3d-493b-8c11-2d59672a99e1-reloader\") pod \"frr-k8s-qwb5s\" (UID: \"54e0a364-1f3d-493b-8c11-2d59672a99e1\") " pod="metallb-system/frr-k8s-qwb5s" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.199713 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/54e0a364-1f3d-493b-8c11-2d59672a99e1-frr-sockets\") pod \"frr-k8s-qwb5s\" (UID: \"54e0a364-1f3d-493b-8c11-2d59672a99e1\") " pod="metallb-system/frr-k8s-qwb5s" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.199824 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/54e0a364-1f3d-493b-8c11-2d59672a99e1-frr-conf\") pod \"frr-k8s-qwb5s\" (UID: \"54e0a364-1f3d-493b-8c11-2d59672a99e1\") " pod="metallb-system/frr-k8s-qwb5s" Nov 24 
09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.200202 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/54e0a364-1f3d-493b-8c11-2d59672a99e1-frr-startup\") pod \"frr-k8s-qwb5s\" (UID: \"54e0a364-1f3d-493b-8c11-2d59672a99e1\") " pod="metallb-system/frr-k8s-qwb5s" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.200874 4563 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.204950 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3b0c98e8-df1b-485b-972d-2e2ff8103006-metrics-certs\") pod \"speaker-tddpp\" (UID: \"3b0c98e8-df1b-485b-972d-2e2ff8103006\") " pod="metallb-system/speaker-tddpp" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.204990 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54e0a364-1f3d-493b-8c11-2d59672a99e1-metrics-certs\") pod \"frr-k8s-qwb5s\" (UID: \"54e0a364-1f3d-493b-8c11-2d59672a99e1\") " pod="metallb-system/frr-k8s-qwb5s" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.205495 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22afba2f-88ba-4b65-8f98-a024f676b896-cert\") pod \"frr-k8s-webhook-server-6998585d5-fchts\" (UID: \"22afba2f-88ba-4b65-8f98-a024f676b896\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-fchts" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.206429 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3ae4470-f488-4bc7-b9e0-a37903b5400a-metrics-certs\") pod \"controller-6c7b4b5f48-7rql8\" (UID: \"e3ae4470-f488-4bc7-b9e0-a37903b5400a\") " pod="metallb-system/controller-6c7b4b5f48-7rql8" Nov 24 09:14:55 crc 
kubenswrapper[4563]: I1124 09:14:55.211945 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3ae4470-f488-4bc7-b9e0-a37903b5400a-cert\") pod \"controller-6c7b4b5f48-7rql8\" (UID: \"e3ae4470-f488-4bc7-b9e0-a37903b5400a\") " pod="metallb-system/controller-6c7b4b5f48-7rql8" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.213291 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdd4p\" (UniqueName: \"kubernetes.io/projected/e3ae4470-f488-4bc7-b9e0-a37903b5400a-kube-api-access-fdd4p\") pod \"controller-6c7b4b5f48-7rql8\" (UID: \"e3ae4470-f488-4bc7-b9e0-a37903b5400a\") " pod="metallb-system/controller-6c7b4b5f48-7rql8" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.213990 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxsdb\" (UniqueName: \"kubernetes.io/projected/3b0c98e8-df1b-485b-972d-2e2ff8103006-kube-api-access-qxsdb\") pod \"speaker-tddpp\" (UID: \"3b0c98e8-df1b-485b-972d-2e2ff8103006\") " pod="metallb-system/speaker-tddpp" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.214240 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfhfc\" (UniqueName: \"kubernetes.io/projected/22afba2f-88ba-4b65-8f98-a024f676b896-kube-api-access-kfhfc\") pod \"frr-k8s-webhook-server-6998585d5-fchts\" (UID: \"22afba2f-88ba-4b65-8f98-a024f676b896\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-fchts" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.214905 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z2c5\" (UniqueName: \"kubernetes.io/projected/54e0a364-1f3d-493b-8c11-2d59672a99e1-kube-api-access-4z2c5\") pod \"frr-k8s-qwb5s\" (UID: \"54e0a364-1f3d-493b-8c11-2d59672a99e1\") " pod="metallb-system/frr-k8s-qwb5s" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.237620 4563 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="metallb-system/frr-k8s-qwb5s" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.244779 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-fchts" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.301040 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-7rql8" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.426106 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-fchts"] Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.478422 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-7rql8"] Nov 24 09:14:55 crc kubenswrapper[4563]: W1124 09:14:55.479312 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3ae4470_f488_4bc7_b9e0_a37903b5400a.slice/crio-18730747d906ba0de4b2be9fc415863ad317b2fb68a4a9d66eb27d8eb82ebd89 WatchSource:0}: Error finding container 18730747d906ba0de4b2be9fc415863ad317b2fb68a4a9d66eb27d8eb82ebd89: Status 404 returned error can't find the container with id 18730747d906ba0de4b2be9fc415863ad317b2fb68a4a9d66eb27d8eb82ebd89 Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.480669 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qwb5s" event={"ID":"54e0a364-1f3d-493b-8c11-2d59672a99e1","Type":"ContainerStarted","Data":"77d53330859158d74496bc6d6f6ce3fd2a32b53353e2f698df4d15fb71258b66"} Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.481826 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-fchts" event={"ID":"22afba2f-88ba-4b65-8f98-a024f676b896","Type":"ContainerStarted","Data":"1da1419ff33a8df005dabdfb97c30e85ce74f5cd4720428d3a4880b6b00878a2"} Nov 24 09:14:55 
crc kubenswrapper[4563]: I1124 09:14:55.708771 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3b0c98e8-df1b-485b-972d-2e2ff8103006-memberlist\") pod \"speaker-tddpp\" (UID: \"3b0c98e8-df1b-485b-972d-2e2ff8103006\") " pod="metallb-system/speaker-tddpp" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.714906 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3b0c98e8-df1b-485b-972d-2e2ff8103006-memberlist\") pod \"speaker-tddpp\" (UID: \"3b0c98e8-df1b-485b-972d-2e2ff8103006\") " pod="metallb-system/speaker-tddpp" Nov 24 09:14:55 crc kubenswrapper[4563]: I1124 09:14:55.888807 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-tddpp" Nov 24 09:14:55 crc kubenswrapper[4563]: W1124 09:14:55.909101 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b0c98e8_df1b_485b_972d_2e2ff8103006.slice/crio-4fdc162e3ba7f19147cddf3d9aa00a6ff6f6306346716f16c6423c081db53baa WatchSource:0}: Error finding container 4fdc162e3ba7f19147cddf3d9aa00a6ff6f6306346716f16c6423c081db53baa: Status 404 returned error can't find the container with id 4fdc162e3ba7f19147cddf3d9aa00a6ff6f6306346716f16c6423c081db53baa Nov 24 09:14:56 crc kubenswrapper[4563]: I1124 09:14:56.488281 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tddpp" event={"ID":"3b0c98e8-df1b-485b-972d-2e2ff8103006","Type":"ContainerStarted","Data":"8fc97d1d2710165db147864f4eb1ac31b902bc60e9f3de424c1f9e65f3246b3f"} Nov 24 09:14:56 crc kubenswrapper[4563]: I1124 09:14:56.488665 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tddpp" 
event={"ID":"3b0c98e8-df1b-485b-972d-2e2ff8103006","Type":"ContainerStarted","Data":"4d745a21c8b69b873cc71080a5bcde841f9534f92bdb2d220db503572a0e3372"} Nov 24 09:14:56 crc kubenswrapper[4563]: I1124 09:14:56.488679 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tddpp" event={"ID":"3b0c98e8-df1b-485b-972d-2e2ff8103006","Type":"ContainerStarted","Data":"4fdc162e3ba7f19147cddf3d9aa00a6ff6f6306346716f16c6423c081db53baa"} Nov 24 09:14:56 crc kubenswrapper[4563]: I1124 09:14:56.488846 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-tddpp" Nov 24 09:14:56 crc kubenswrapper[4563]: I1124 09:14:56.489622 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-7rql8" event={"ID":"e3ae4470-f488-4bc7-b9e0-a37903b5400a","Type":"ContainerStarted","Data":"29d4bff1ae4e3cc73f93815241a64399604383175957afa0c632fb3fa5ffe595"} Nov 24 09:14:56 crc kubenswrapper[4563]: I1124 09:14:56.489674 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-7rql8" event={"ID":"e3ae4470-f488-4bc7-b9e0-a37903b5400a","Type":"ContainerStarted","Data":"b971036770896348e00bb3655c783e5056329abf9e259907b4ba80176b29dbe9"} Nov 24 09:14:56 crc kubenswrapper[4563]: I1124 09:14:56.489688 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-7rql8" event={"ID":"e3ae4470-f488-4bc7-b9e0-a37903b5400a","Type":"ContainerStarted","Data":"18730747d906ba0de4b2be9fc415863ad317b2fb68a4a9d66eb27d8eb82ebd89"} Nov 24 09:14:56 crc kubenswrapper[4563]: I1124 09:14:56.489863 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6c7b4b5f48-7rql8" Nov 24 09:14:56 crc kubenswrapper[4563]: I1124 09:14:56.505679 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-tddpp" podStartSLOduration=2.505667544 podStartE2EDuration="2.505667544s" 
podCreationTimestamp="2025-11-24 09:14:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:14:56.501745589 +0000 UTC m=+673.760723037" watchObservedRunningTime="2025-11-24 09:14:56.505667544 +0000 UTC m=+673.764644992" Nov 24 09:14:56 crc kubenswrapper[4563]: I1124 09:14:56.517289 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6c7b4b5f48-7rql8" podStartSLOduration=2.517260565 podStartE2EDuration="2.517260565s" podCreationTimestamp="2025-11-24 09:14:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:14:56.515494385 +0000 UTC m=+673.774471832" watchObservedRunningTime="2025-11-24 09:14:56.517260565 +0000 UTC m=+673.776238013" Nov 24 09:15:00 crc kubenswrapper[4563]: I1124 09:15:00.130604 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399595-p4k4t"] Nov 24 09:15:00 crc kubenswrapper[4563]: I1124 09:15:00.131834 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399595-p4k4t" Nov 24 09:15:00 crc kubenswrapper[4563]: I1124 09:15:00.134294 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 24 09:15:00 crc kubenswrapper[4563]: I1124 09:15:00.134377 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 24 09:15:00 crc kubenswrapper[4563]: I1124 09:15:00.142091 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399595-p4k4t"] Nov 24 09:15:00 crc kubenswrapper[4563]: I1124 09:15:00.282249 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/92446a78-c0f0-433a-a410-9414ceb0a78d-secret-volume\") pod \"collect-profiles-29399595-p4k4t\" (UID: \"92446a78-c0f0-433a-a410-9414ceb0a78d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399595-p4k4t" Nov 24 09:15:00 crc kubenswrapper[4563]: I1124 09:15:00.282401 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92446a78-c0f0-433a-a410-9414ceb0a78d-config-volume\") pod \"collect-profiles-29399595-p4k4t\" (UID: \"92446a78-c0f0-433a-a410-9414ceb0a78d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399595-p4k4t" Nov 24 09:15:00 crc kubenswrapper[4563]: I1124 09:15:00.282427 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhjx2\" (UniqueName: \"kubernetes.io/projected/92446a78-c0f0-433a-a410-9414ceb0a78d-kube-api-access-hhjx2\") pod \"collect-profiles-29399595-p4k4t\" (UID: \"92446a78-c0f0-433a-a410-9414ceb0a78d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29399595-p4k4t" Nov 24 09:15:00 crc kubenswrapper[4563]: I1124 09:15:00.384006 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/92446a78-c0f0-433a-a410-9414ceb0a78d-secret-volume\") pod \"collect-profiles-29399595-p4k4t\" (UID: \"92446a78-c0f0-433a-a410-9414ceb0a78d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399595-p4k4t" Nov 24 09:15:00 crc kubenswrapper[4563]: I1124 09:15:00.384084 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92446a78-c0f0-433a-a410-9414ceb0a78d-config-volume\") pod \"collect-profiles-29399595-p4k4t\" (UID: \"92446a78-c0f0-433a-a410-9414ceb0a78d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399595-p4k4t" Nov 24 09:15:00 crc kubenswrapper[4563]: I1124 09:15:00.384137 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhjx2\" (UniqueName: \"kubernetes.io/projected/92446a78-c0f0-433a-a410-9414ceb0a78d-kube-api-access-hhjx2\") pod \"collect-profiles-29399595-p4k4t\" (UID: \"92446a78-c0f0-433a-a410-9414ceb0a78d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399595-p4k4t" Nov 24 09:15:00 crc kubenswrapper[4563]: I1124 09:15:00.385054 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92446a78-c0f0-433a-a410-9414ceb0a78d-config-volume\") pod \"collect-profiles-29399595-p4k4t\" (UID: \"92446a78-c0f0-433a-a410-9414ceb0a78d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399595-p4k4t" Nov 24 09:15:00 crc kubenswrapper[4563]: I1124 09:15:00.392146 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/92446a78-c0f0-433a-a410-9414ceb0a78d-secret-volume\") pod \"collect-profiles-29399595-p4k4t\" (UID: \"92446a78-c0f0-433a-a410-9414ceb0a78d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399595-p4k4t" Nov 24 09:15:00 crc kubenswrapper[4563]: I1124 09:15:00.397942 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhjx2\" (UniqueName: \"kubernetes.io/projected/92446a78-c0f0-433a-a410-9414ceb0a78d-kube-api-access-hhjx2\") pod \"collect-profiles-29399595-p4k4t\" (UID: \"92446a78-c0f0-433a-a410-9414ceb0a78d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399595-p4k4t" Nov 24 09:15:00 crc kubenswrapper[4563]: I1124 09:15:00.456686 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399595-p4k4t" Nov 24 09:15:02 crc kubenswrapper[4563]: I1124 09:15:02.168775 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399595-p4k4t"] Nov 24 09:15:02 crc kubenswrapper[4563]: W1124 09:15:02.173239 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92446a78_c0f0_433a_a410_9414ceb0a78d.slice/crio-dba792bf80fabfcd23111c33137247141cad04f9b124e801c5014763c59d8745 WatchSource:0}: Error finding container dba792bf80fabfcd23111c33137247141cad04f9b124e801c5014763c59d8745: Status 404 returned error can't find the container with id dba792bf80fabfcd23111c33137247141cad04f9b124e801c5014763c59d8745 Nov 24 09:15:02 crc kubenswrapper[4563]: I1124 09:15:02.543859 4563 generic.go:334] "Generic (PLEG): container finished" podID="92446a78-c0f0-433a-a410-9414ceb0a78d" containerID="56039f2995c9760adca7d3fd0030414b8abaccbe79f400d82a567adeff990325" exitCode=0 Nov 24 09:15:02 crc kubenswrapper[4563]: I1124 09:15:02.543979 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29399595-p4k4t" event={"ID":"92446a78-c0f0-433a-a410-9414ceb0a78d","Type":"ContainerDied","Data":"56039f2995c9760adca7d3fd0030414b8abaccbe79f400d82a567adeff990325"} Nov 24 09:15:02 crc kubenswrapper[4563]: I1124 09:15:02.544378 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399595-p4k4t" event={"ID":"92446a78-c0f0-433a-a410-9414ceb0a78d","Type":"ContainerStarted","Data":"dba792bf80fabfcd23111c33137247141cad04f9b124e801c5014763c59d8745"} Nov 24 09:15:02 crc kubenswrapper[4563]: I1124 09:15:02.546377 4563 generic.go:334] "Generic (PLEG): container finished" podID="54e0a364-1f3d-493b-8c11-2d59672a99e1" containerID="beef420197b66b249e229852b24952fdb4dd237caf4ddb2ed976032fefe96302" exitCode=0 Nov 24 09:15:02 crc kubenswrapper[4563]: I1124 09:15:02.546461 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qwb5s" event={"ID":"54e0a364-1f3d-493b-8c11-2d59672a99e1","Type":"ContainerDied","Data":"beef420197b66b249e229852b24952fdb4dd237caf4ddb2ed976032fefe96302"} Nov 24 09:15:02 crc kubenswrapper[4563]: I1124 09:15:02.547969 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-fchts" event={"ID":"22afba2f-88ba-4b65-8f98-a024f676b896","Type":"ContainerStarted","Data":"d6f960771ff8a6f3f61950aafa4b205a2c5c12891ce62e1e25b5b84269bc177e"} Nov 24 09:15:02 crc kubenswrapper[4563]: I1124 09:15:02.548179 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-6998585d5-fchts" Nov 24 09:15:02 crc kubenswrapper[4563]: I1124 09:15:02.594457 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-6998585d5-fchts" podStartSLOduration=2.162423986 podStartE2EDuration="8.594436401s" podCreationTimestamp="2025-11-24 09:14:54 +0000 UTC" firstStartedPulling="2025-11-24 
09:14:55.427966893 +0000 UTC m=+672.686944341" lastFinishedPulling="2025-11-24 09:15:01.859979308 +0000 UTC m=+679.118956756" observedRunningTime="2025-11-24 09:15:02.590857923 +0000 UTC m=+679.849835369" watchObservedRunningTime="2025-11-24 09:15:02.594436401 +0000 UTC m=+679.853413848" Nov 24 09:15:03 crc kubenswrapper[4563]: I1124 09:15:03.554458 4563 generic.go:334] "Generic (PLEG): container finished" podID="54e0a364-1f3d-493b-8c11-2d59672a99e1" containerID="9d4cc15571afd3aa754a1cc1358a26d27b349864ca0cca098e9e5790cd9303fe" exitCode=0 Nov 24 09:15:03 crc kubenswrapper[4563]: I1124 09:15:03.555546 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qwb5s" event={"ID":"54e0a364-1f3d-493b-8c11-2d59672a99e1","Type":"ContainerDied","Data":"9d4cc15571afd3aa754a1cc1358a26d27b349864ca0cca098e9e5790cd9303fe"} Nov 24 09:15:03 crc kubenswrapper[4563]: I1124 09:15:03.763440 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399595-p4k4t" Nov 24 09:15:03 crc kubenswrapper[4563]: I1124 09:15:03.937377 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhjx2\" (UniqueName: \"kubernetes.io/projected/92446a78-c0f0-433a-a410-9414ceb0a78d-kube-api-access-hhjx2\") pod \"92446a78-c0f0-433a-a410-9414ceb0a78d\" (UID: \"92446a78-c0f0-433a-a410-9414ceb0a78d\") " Nov 24 09:15:03 crc kubenswrapper[4563]: I1124 09:15:03.937507 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/92446a78-c0f0-433a-a410-9414ceb0a78d-secret-volume\") pod \"92446a78-c0f0-433a-a410-9414ceb0a78d\" (UID: \"92446a78-c0f0-433a-a410-9414ceb0a78d\") " Nov 24 09:15:03 crc kubenswrapper[4563]: I1124 09:15:03.937541 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/92446a78-c0f0-433a-a410-9414ceb0a78d-config-volume\") pod \"92446a78-c0f0-433a-a410-9414ceb0a78d\" (UID: \"92446a78-c0f0-433a-a410-9414ceb0a78d\") " Nov 24 09:15:03 crc kubenswrapper[4563]: I1124 09:15:03.938305 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92446a78-c0f0-433a-a410-9414ceb0a78d-config-volume" (OuterVolumeSpecName: "config-volume") pod "92446a78-c0f0-433a-a410-9414ceb0a78d" (UID: "92446a78-c0f0-433a-a410-9414ceb0a78d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:15:03 crc kubenswrapper[4563]: I1124 09:15:03.943464 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92446a78-c0f0-433a-a410-9414ceb0a78d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "92446a78-c0f0-433a-a410-9414ceb0a78d" (UID: "92446a78-c0f0-433a-a410-9414ceb0a78d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:15:03 crc kubenswrapper[4563]: I1124 09:15:03.943625 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92446a78-c0f0-433a-a410-9414ceb0a78d-kube-api-access-hhjx2" (OuterVolumeSpecName: "kube-api-access-hhjx2") pod "92446a78-c0f0-433a-a410-9414ceb0a78d" (UID: "92446a78-c0f0-433a-a410-9414ceb0a78d"). InnerVolumeSpecName "kube-api-access-hhjx2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:15:04 crc kubenswrapper[4563]: I1124 09:15:04.039840 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhjx2\" (UniqueName: \"kubernetes.io/projected/92446a78-c0f0-433a-a410-9414ceb0a78d-kube-api-access-hhjx2\") on node \"crc\" DevicePath \"\"" Nov 24 09:15:04 crc kubenswrapper[4563]: I1124 09:15:04.039874 4563 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/92446a78-c0f0-433a-a410-9414ceb0a78d-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 24 09:15:04 crc kubenswrapper[4563]: I1124 09:15:04.039884 4563 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92446a78-c0f0-433a-a410-9414ceb0a78d-config-volume\") on node \"crc\" DevicePath \"\"" Nov 24 09:15:04 crc kubenswrapper[4563]: I1124 09:15:04.563835 4563 generic.go:334] "Generic (PLEG): container finished" podID="54e0a364-1f3d-493b-8c11-2d59672a99e1" containerID="945287337b85556313b641f77fb2927ee3cb9a4feac3ec3785f709d0531ea42f" exitCode=0 Nov 24 09:15:04 crc kubenswrapper[4563]: I1124 09:15:04.563907 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qwb5s" event={"ID":"54e0a364-1f3d-493b-8c11-2d59672a99e1","Type":"ContainerDied","Data":"945287337b85556313b641f77fb2927ee3cb9a4feac3ec3785f709d0531ea42f"} Nov 24 09:15:04 crc kubenswrapper[4563]: I1124 09:15:04.567212 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399595-p4k4t" event={"ID":"92446a78-c0f0-433a-a410-9414ceb0a78d","Type":"ContainerDied","Data":"dba792bf80fabfcd23111c33137247141cad04f9b124e801c5014763c59d8745"} Nov 24 09:15:04 crc kubenswrapper[4563]: I1124 09:15:04.567338 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dba792bf80fabfcd23111c33137247141cad04f9b124e801c5014763c59d8745" Nov 24 
09:15:04 crc kubenswrapper[4563]: I1124 09:15:04.567278 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399595-p4k4t" Nov 24 09:15:05 crc kubenswrapper[4563]: I1124 09:15:05.306250 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6c7b4b5f48-7rql8" Nov 24 09:15:05 crc kubenswrapper[4563]: I1124 09:15:05.587181 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qwb5s" event={"ID":"54e0a364-1f3d-493b-8c11-2d59672a99e1","Type":"ContainerStarted","Data":"330adef9fd62b1c2bc1778b5db0d01adf24e92e8861b4fa9e08fb976af5134d1"} Nov 24 09:15:05 crc kubenswrapper[4563]: I1124 09:15:05.587295 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qwb5s" event={"ID":"54e0a364-1f3d-493b-8c11-2d59672a99e1","Type":"ContainerStarted","Data":"14a53d626d9488a56464ef8779c5703f1ea6ffad205c27e8a8aeac66e1bdb302"} Nov 24 09:15:05 crc kubenswrapper[4563]: I1124 09:15:05.587311 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qwb5s" event={"ID":"54e0a364-1f3d-493b-8c11-2d59672a99e1","Type":"ContainerStarted","Data":"ba89042767949eee883c3c86b2f3f70459cd4a7822feb7721ef50a496a17d9a2"} Nov 24 09:15:05 crc kubenswrapper[4563]: I1124 09:15:05.587322 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qwb5s" event={"ID":"54e0a364-1f3d-493b-8c11-2d59672a99e1","Type":"ContainerStarted","Data":"cd7d4b3ac27c146adbcc1b529a99997e890c2edd00b159d121f61822a3b0b1b4"} Nov 24 09:15:05 crc kubenswrapper[4563]: I1124 09:15:05.587332 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qwb5s" event={"ID":"54e0a364-1f3d-493b-8c11-2d59672a99e1","Type":"ContainerStarted","Data":"637164ccf9417c246a9fa5185499bbfb0a8f06df1baa002f82d2f11c635dc248"} Nov 24 09:15:05 crc kubenswrapper[4563]: I1124 09:15:05.587342 4563 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qwb5s" event={"ID":"54e0a364-1f3d-493b-8c11-2d59672a99e1","Type":"ContainerStarted","Data":"d8287b9d2d33db60ac026f6f5348f91d5fef96196c19d759622710854e9d5b0c"} Nov 24 09:15:05 crc kubenswrapper[4563]: I1124 09:15:05.587478 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-qwb5s" Nov 24 09:15:05 crc kubenswrapper[4563]: I1124 09:15:05.609781 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-qwb5s" podStartSLOduration=5.123059511 podStartE2EDuration="11.60976569s" podCreationTimestamp="2025-11-24 09:14:54 +0000 UTC" firstStartedPulling="2025-11-24 09:14:55.356009497 +0000 UTC m=+672.614986943" lastFinishedPulling="2025-11-24 09:15:01.842715675 +0000 UTC m=+679.101693122" observedRunningTime="2025-11-24 09:15:05.604416043 +0000 UTC m=+682.863393490" watchObservedRunningTime="2025-11-24 09:15:05.60976569 +0000 UTC m=+682.868743137" Nov 24 09:15:10 crc kubenswrapper[4563]: I1124 09:15:10.238014 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-qwb5s" Nov 24 09:15:10 crc kubenswrapper[4563]: I1124 09:15:10.270835 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-qwb5s" Nov 24 09:15:15 crc kubenswrapper[4563]: I1124 09:15:15.242046 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-qwb5s" Nov 24 09:15:15 crc kubenswrapper[4563]: I1124 09:15:15.254733 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-6998585d5-fchts" Nov 24 09:15:15 crc kubenswrapper[4563]: I1124 09:15:15.892564 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-tddpp" Nov 24 09:15:17 crc kubenswrapper[4563]: I1124 09:15:17.964552 4563 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack-operators/openstack-operator-index-nf2f6"] Nov 24 09:15:17 crc kubenswrapper[4563]: E1124 09:15:17.965267 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92446a78-c0f0-433a-a410-9414ceb0a78d" containerName="collect-profiles" Nov 24 09:15:17 crc kubenswrapper[4563]: I1124 09:15:17.965284 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="92446a78-c0f0-433a-a410-9414ceb0a78d" containerName="collect-profiles" Nov 24 09:15:17 crc kubenswrapper[4563]: I1124 09:15:17.965396 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="92446a78-c0f0-433a-a410-9414ceb0a78d" containerName="collect-profiles" Nov 24 09:15:17 crc kubenswrapper[4563]: I1124 09:15:17.965807 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nf2f6" Nov 24 09:15:17 crc kubenswrapper[4563]: I1124 09:15:17.968181 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Nov 24 09:15:17 crc kubenswrapper[4563]: I1124 09:15:17.968565 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Nov 24 09:15:17 crc kubenswrapper[4563]: I1124 09:15:17.968736 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-b5pjt" Nov 24 09:15:18 crc kubenswrapper[4563]: I1124 09:15:18.010617 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nf2f6"] Nov 24 09:15:18 crc kubenswrapper[4563]: I1124 09:15:18.030220 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqw6k\" (UniqueName: \"kubernetes.io/projected/32af54d4-b8bd-4636-83c0-ffacd21003df-kube-api-access-nqw6k\") pod \"openstack-operator-index-nf2f6\" (UID: \"32af54d4-b8bd-4636-83c0-ffacd21003df\") " 
pod="openstack-operators/openstack-operator-index-nf2f6" Nov 24 09:15:18 crc kubenswrapper[4563]: I1124 09:15:18.131274 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqw6k\" (UniqueName: \"kubernetes.io/projected/32af54d4-b8bd-4636-83c0-ffacd21003df-kube-api-access-nqw6k\") pod \"openstack-operator-index-nf2f6\" (UID: \"32af54d4-b8bd-4636-83c0-ffacd21003df\") " pod="openstack-operators/openstack-operator-index-nf2f6" Nov 24 09:15:18 crc kubenswrapper[4563]: I1124 09:15:18.147943 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqw6k\" (UniqueName: \"kubernetes.io/projected/32af54d4-b8bd-4636-83c0-ffacd21003df-kube-api-access-nqw6k\") pod \"openstack-operator-index-nf2f6\" (UID: \"32af54d4-b8bd-4636-83c0-ffacd21003df\") " pod="openstack-operators/openstack-operator-index-nf2f6" Nov 24 09:15:18 crc kubenswrapper[4563]: I1124 09:15:18.282710 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nf2f6" Nov 24 09:15:18 crc kubenswrapper[4563]: I1124 09:15:18.649038 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nf2f6"] Nov 24 09:15:18 crc kubenswrapper[4563]: W1124 09:15:18.658752 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32af54d4_b8bd_4636_83c0_ffacd21003df.slice/crio-a5a7cbea57a0ab690735613cfc6ac00df2ef178761b38c0e5426e586a13b91fb WatchSource:0}: Error finding container a5a7cbea57a0ab690735613cfc6ac00df2ef178761b38c0e5426e586a13b91fb: Status 404 returned error can't find the container with id a5a7cbea57a0ab690735613cfc6ac00df2ef178761b38c0e5426e586a13b91fb Nov 24 09:15:19 crc kubenswrapper[4563]: I1124 09:15:19.675623 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nf2f6" 
event={"ID":"32af54d4-b8bd-4636-83c0-ffacd21003df","Type":"ContainerStarted","Data":"a5a7cbea57a0ab690735613cfc6ac00df2ef178761b38c0e5426e586a13b91fb"} Nov 24 09:15:20 crc kubenswrapper[4563]: I1124 09:15:20.679876 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nf2f6" event={"ID":"32af54d4-b8bd-4636-83c0-ffacd21003df","Type":"ContainerStarted","Data":"e39679ac3bc98f9f23a18b69a77c1d5f167e2b175de87d369a0294cdcad99c17"} Nov 24 09:15:20 crc kubenswrapper[4563]: I1124 09:15:20.692055 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-nf2f6" podStartSLOduration=2.292712202 podStartE2EDuration="3.692043228s" podCreationTimestamp="2025-11-24 09:15:17 +0000 UTC" firstStartedPulling="2025-11-24 09:15:18.667162521 +0000 UTC m=+695.926139968" lastFinishedPulling="2025-11-24 09:15:20.066493547 +0000 UTC m=+697.325470994" observedRunningTime="2025-11-24 09:15:20.689213441 +0000 UTC m=+697.948190888" watchObservedRunningTime="2025-11-24 09:15:20.692043228 +0000 UTC m=+697.951020674" Nov 24 09:15:21 crc kubenswrapper[4563]: I1124 09:15:21.343329 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-nf2f6"] Nov 24 09:15:21 crc kubenswrapper[4563]: I1124 09:15:21.947099 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-5f5wm"] Nov 24 09:15:21 crc kubenswrapper[4563]: I1124 09:15:21.947883 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5f5wm" Nov 24 09:15:21 crc kubenswrapper[4563]: I1124 09:15:21.955909 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5f5wm"] Nov 24 09:15:21 crc kubenswrapper[4563]: I1124 09:15:21.980024 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhmkf\" (UniqueName: \"kubernetes.io/projected/0079c598-0bc4-4809-9813-0aa163a961a1-kube-api-access-fhmkf\") pod \"openstack-operator-index-5f5wm\" (UID: \"0079c598-0bc4-4809-9813-0aa163a961a1\") " pod="openstack-operators/openstack-operator-index-5f5wm" Nov 24 09:15:22 crc kubenswrapper[4563]: I1124 09:15:22.081363 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhmkf\" (UniqueName: \"kubernetes.io/projected/0079c598-0bc4-4809-9813-0aa163a961a1-kube-api-access-fhmkf\") pod \"openstack-operator-index-5f5wm\" (UID: \"0079c598-0bc4-4809-9813-0aa163a961a1\") " pod="openstack-operators/openstack-operator-index-5f5wm" Nov 24 09:15:22 crc kubenswrapper[4563]: I1124 09:15:22.097963 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhmkf\" (UniqueName: \"kubernetes.io/projected/0079c598-0bc4-4809-9813-0aa163a961a1-kube-api-access-fhmkf\") pod \"openstack-operator-index-5f5wm\" (UID: \"0079c598-0bc4-4809-9813-0aa163a961a1\") " pod="openstack-operators/openstack-operator-index-5f5wm" Nov 24 09:15:22 crc kubenswrapper[4563]: I1124 09:15:22.261934 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5f5wm" Nov 24 09:15:22 crc kubenswrapper[4563]: I1124 09:15:22.631835 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5f5wm"] Nov 24 09:15:22 crc kubenswrapper[4563]: W1124 09:15:22.638368 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0079c598_0bc4_4809_9813_0aa163a961a1.slice/crio-584730afe659a817e97f5747c291010f056628b6125cde05239eac46e9a955e0 WatchSource:0}: Error finding container 584730afe659a817e97f5747c291010f056628b6125cde05239eac46e9a955e0: Status 404 returned error can't find the container with id 584730afe659a817e97f5747c291010f056628b6125cde05239eac46e9a955e0 Nov 24 09:15:22 crc kubenswrapper[4563]: I1124 09:15:22.691182 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5f5wm" event={"ID":"0079c598-0bc4-4809-9813-0aa163a961a1","Type":"ContainerStarted","Data":"584730afe659a817e97f5747c291010f056628b6125cde05239eac46e9a955e0"} Nov 24 09:15:22 crc kubenswrapper[4563]: I1124 09:15:22.691280 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-nf2f6" podUID="32af54d4-b8bd-4636-83c0-ffacd21003df" containerName="registry-server" containerID="cri-o://e39679ac3bc98f9f23a18b69a77c1d5f167e2b175de87d369a0294cdcad99c17" gracePeriod=2 Nov 24 09:15:22 crc kubenswrapper[4563]: I1124 09:15:22.981385 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-nf2f6" Nov 24 09:15:23 crc kubenswrapper[4563]: I1124 09:15:23.094284 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqw6k\" (UniqueName: \"kubernetes.io/projected/32af54d4-b8bd-4636-83c0-ffacd21003df-kube-api-access-nqw6k\") pod \"32af54d4-b8bd-4636-83c0-ffacd21003df\" (UID: \"32af54d4-b8bd-4636-83c0-ffacd21003df\") " Nov 24 09:15:23 crc kubenswrapper[4563]: I1124 09:15:23.099367 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32af54d4-b8bd-4636-83c0-ffacd21003df-kube-api-access-nqw6k" (OuterVolumeSpecName: "kube-api-access-nqw6k") pod "32af54d4-b8bd-4636-83c0-ffacd21003df" (UID: "32af54d4-b8bd-4636-83c0-ffacd21003df"). InnerVolumeSpecName "kube-api-access-nqw6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:15:23 crc kubenswrapper[4563]: I1124 09:15:23.196238 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqw6k\" (UniqueName: \"kubernetes.io/projected/32af54d4-b8bd-4636-83c0-ffacd21003df-kube-api-access-nqw6k\") on node \"crc\" DevicePath \"\"" Nov 24 09:15:23 crc kubenswrapper[4563]: I1124 09:15:23.697366 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5f5wm" event={"ID":"0079c598-0bc4-4809-9813-0aa163a961a1","Type":"ContainerStarted","Data":"23d431c961864f465a02836ca1ae1a9967bd09521080499dcfce36b2ca129b64"} Nov 24 09:15:23 crc kubenswrapper[4563]: I1124 09:15:23.698827 4563 generic.go:334] "Generic (PLEG): container finished" podID="32af54d4-b8bd-4636-83c0-ffacd21003df" containerID="e39679ac3bc98f9f23a18b69a77c1d5f167e2b175de87d369a0294cdcad99c17" exitCode=0 Nov 24 09:15:23 crc kubenswrapper[4563]: I1124 09:15:23.698866 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nf2f6" 
event={"ID":"32af54d4-b8bd-4636-83c0-ffacd21003df","Type":"ContainerDied","Data":"e39679ac3bc98f9f23a18b69a77c1d5f167e2b175de87d369a0294cdcad99c17"} Nov 24 09:15:23 crc kubenswrapper[4563]: I1124 09:15:23.698892 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nf2f6" event={"ID":"32af54d4-b8bd-4636-83c0-ffacd21003df","Type":"ContainerDied","Data":"a5a7cbea57a0ab690735613cfc6ac00df2ef178761b38c0e5426e586a13b91fb"} Nov 24 09:15:23 crc kubenswrapper[4563]: I1124 09:15:23.698911 4563 scope.go:117] "RemoveContainer" containerID="e39679ac3bc98f9f23a18b69a77c1d5f167e2b175de87d369a0294cdcad99c17" Nov 24 09:15:23 crc kubenswrapper[4563]: I1124 09:15:23.698942 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nf2f6" Nov 24 09:15:23 crc kubenswrapper[4563]: I1124 09:15:23.718492 4563 scope.go:117] "RemoveContainer" containerID="e39679ac3bc98f9f23a18b69a77c1d5f167e2b175de87d369a0294cdcad99c17" Nov 24 09:15:23 crc kubenswrapper[4563]: E1124 09:15:23.718859 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e39679ac3bc98f9f23a18b69a77c1d5f167e2b175de87d369a0294cdcad99c17\": container with ID starting with e39679ac3bc98f9f23a18b69a77c1d5f167e2b175de87d369a0294cdcad99c17 not found: ID does not exist" containerID="e39679ac3bc98f9f23a18b69a77c1d5f167e2b175de87d369a0294cdcad99c17" Nov 24 09:15:23 crc kubenswrapper[4563]: I1124 09:15:23.718909 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e39679ac3bc98f9f23a18b69a77c1d5f167e2b175de87d369a0294cdcad99c17"} err="failed to get container status \"e39679ac3bc98f9f23a18b69a77c1d5f167e2b175de87d369a0294cdcad99c17\": rpc error: code = NotFound desc = could not find container \"e39679ac3bc98f9f23a18b69a77c1d5f167e2b175de87d369a0294cdcad99c17\": container with ID starting with 
e39679ac3bc98f9f23a18b69a77c1d5f167e2b175de87d369a0294cdcad99c17 not found: ID does not exist" Nov 24 09:15:23 crc kubenswrapper[4563]: I1124 09:15:23.721431 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-5f5wm" podStartSLOduration=2.231473607 podStartE2EDuration="2.721418554s" podCreationTimestamp="2025-11-24 09:15:21 +0000 UTC" firstStartedPulling="2025-11-24 09:15:22.642816342 +0000 UTC m=+699.901793789" lastFinishedPulling="2025-11-24 09:15:23.132761289 +0000 UTC m=+700.391738736" observedRunningTime="2025-11-24 09:15:23.715014468 +0000 UTC m=+700.973991935" watchObservedRunningTime="2025-11-24 09:15:23.721418554 +0000 UTC m=+700.980396001" Nov 24 09:15:23 crc kubenswrapper[4563]: I1124 09:15:23.726122 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-nf2f6"] Nov 24 09:15:23 crc kubenswrapper[4563]: I1124 09:15:23.728596 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-nf2f6"] Nov 24 09:15:25 crc kubenswrapper[4563]: I1124 09:15:25.061345 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32af54d4-b8bd-4636-83c0-ffacd21003df" path="/var/lib/kubelet/pods/32af54d4-b8bd-4636-83c0-ffacd21003df/volumes" Nov 24 09:15:32 crc kubenswrapper[4563]: I1124 09:15:32.262728 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-5f5wm" Nov 24 09:15:32 crc kubenswrapper[4563]: I1124 09:15:32.263129 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-5f5wm" Nov 24 09:15:32 crc kubenswrapper[4563]: I1124 09:15:32.288998 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-5f5wm" Nov 24 09:15:32 crc kubenswrapper[4563]: I1124 09:15:32.791764 4563 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-5f5wm" Nov 24 09:15:39 crc kubenswrapper[4563]: I1124 09:15:39.388607 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d"] Nov 24 09:15:39 crc kubenswrapper[4563]: E1124 09:15:39.389276 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32af54d4-b8bd-4636-83c0-ffacd21003df" containerName="registry-server" Nov 24 09:15:39 crc kubenswrapper[4563]: I1124 09:15:39.389289 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="32af54d4-b8bd-4636-83c0-ffacd21003df" containerName="registry-server" Nov 24 09:15:39 crc kubenswrapper[4563]: I1124 09:15:39.389426 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="32af54d4-b8bd-4636-83c0-ffacd21003df" containerName="registry-server" Nov 24 09:15:39 crc kubenswrapper[4563]: I1124 09:15:39.390096 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d" Nov 24 09:15:39 crc kubenswrapper[4563]: I1124 09:15:39.394248 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-79h9l" Nov 24 09:15:39 crc kubenswrapper[4563]: I1124 09:15:39.396986 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d"] Nov 24 09:15:39 crc kubenswrapper[4563]: I1124 09:15:39.404954 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d314d9f-2d34-4e4c-899e-a113c55ad0df-util\") pod \"1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d\" (UID: \"8d314d9f-2d34-4e4c-899e-a113c55ad0df\") " pod="openstack-operators/1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d" Nov 24 09:15:39 crc kubenswrapper[4563]: I1124 09:15:39.405026 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d314d9f-2d34-4e4c-899e-a113c55ad0df-bundle\") pod \"1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d\" (UID: \"8d314d9f-2d34-4e4c-899e-a113c55ad0df\") " pod="openstack-operators/1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d" Nov 24 09:15:39 crc kubenswrapper[4563]: I1124 09:15:39.405059 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trvtb\" (UniqueName: \"kubernetes.io/projected/8d314d9f-2d34-4e4c-899e-a113c55ad0df-kube-api-access-trvtb\") pod \"1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d\" (UID: \"8d314d9f-2d34-4e4c-899e-a113c55ad0df\") " pod="openstack-operators/1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d" Nov 24 09:15:39 crc kubenswrapper[4563]: I1124 
09:15:39.506301 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d314d9f-2d34-4e4c-899e-a113c55ad0df-bundle\") pod \"1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d\" (UID: \"8d314d9f-2d34-4e4c-899e-a113c55ad0df\") " pod="openstack-operators/1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d" Nov 24 09:15:39 crc kubenswrapper[4563]: I1124 09:15:39.506369 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trvtb\" (UniqueName: \"kubernetes.io/projected/8d314d9f-2d34-4e4c-899e-a113c55ad0df-kube-api-access-trvtb\") pod \"1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d\" (UID: \"8d314d9f-2d34-4e4c-899e-a113c55ad0df\") " pod="openstack-operators/1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d" Nov 24 09:15:39 crc kubenswrapper[4563]: I1124 09:15:39.506438 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d314d9f-2d34-4e4c-899e-a113c55ad0df-util\") pod \"1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d\" (UID: \"8d314d9f-2d34-4e4c-899e-a113c55ad0df\") " pod="openstack-operators/1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d" Nov 24 09:15:39 crc kubenswrapper[4563]: I1124 09:15:39.507078 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d314d9f-2d34-4e4c-899e-a113c55ad0df-bundle\") pod \"1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d\" (UID: \"8d314d9f-2d34-4e4c-899e-a113c55ad0df\") " pod="openstack-operators/1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d" Nov 24 09:15:39 crc kubenswrapper[4563]: I1124 09:15:39.507114 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/8d314d9f-2d34-4e4c-899e-a113c55ad0df-util\") pod \"1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d\" (UID: \"8d314d9f-2d34-4e4c-899e-a113c55ad0df\") " pod="openstack-operators/1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d" Nov 24 09:15:39 crc kubenswrapper[4563]: I1124 09:15:39.524445 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trvtb\" (UniqueName: \"kubernetes.io/projected/8d314d9f-2d34-4e4c-899e-a113c55ad0df-kube-api-access-trvtb\") pod \"1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d\" (UID: \"8d314d9f-2d34-4e4c-899e-a113c55ad0df\") " pod="openstack-operators/1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d" Nov 24 09:15:39 crc kubenswrapper[4563]: I1124 09:15:39.704426 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d" Nov 24 09:15:40 crc kubenswrapper[4563]: I1124 09:15:40.066448 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d"] Nov 24 09:15:40 crc kubenswrapper[4563]: W1124 09:15:40.070248 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d314d9f_2d34_4e4c_899e_a113c55ad0df.slice/crio-52fe841e827a1bacd07c8206431c1a5830a73e79a4be2363fd7013531e797c95 WatchSource:0}: Error finding container 52fe841e827a1bacd07c8206431c1a5830a73e79a4be2363fd7013531e797c95: Status 404 returned error can't find the container with id 52fe841e827a1bacd07c8206431c1a5830a73e79a4be2363fd7013531e797c95 Nov 24 09:15:40 crc kubenswrapper[4563]: I1124 09:15:40.811137 4563 generic.go:334] "Generic (PLEG): container finished" podID="8d314d9f-2d34-4e4c-899e-a113c55ad0df" containerID="c873462251ed35219d3535abaec9d162d7a3dfa0f4fae74e21a4c2c3976bd458" exitCode=0 Nov 24 
09:15:40 crc kubenswrapper[4563]: I1124 09:15:40.811179 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d" event={"ID":"8d314d9f-2d34-4e4c-899e-a113c55ad0df","Type":"ContainerDied","Data":"c873462251ed35219d3535abaec9d162d7a3dfa0f4fae74e21a4c2c3976bd458"} Nov 24 09:15:40 crc kubenswrapper[4563]: I1124 09:15:40.811205 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d" event={"ID":"8d314d9f-2d34-4e4c-899e-a113c55ad0df","Type":"ContainerStarted","Data":"52fe841e827a1bacd07c8206431c1a5830a73e79a4be2363fd7013531e797c95"} Nov 24 09:15:41 crc kubenswrapper[4563]: I1124 09:15:41.823801 4563 generic.go:334] "Generic (PLEG): container finished" podID="8d314d9f-2d34-4e4c-899e-a113c55ad0df" containerID="b4606c5c660f3d71bbbabb5842e041cd21814609dc008640e256daae5043f55d" exitCode=0 Nov 24 09:15:41 crc kubenswrapper[4563]: I1124 09:15:41.824017 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d" event={"ID":"8d314d9f-2d34-4e4c-899e-a113c55ad0df","Type":"ContainerDied","Data":"b4606c5c660f3d71bbbabb5842e041cd21814609dc008640e256daae5043f55d"} Nov 24 09:15:42 crc kubenswrapper[4563]: I1124 09:15:42.833693 4563 generic.go:334] "Generic (PLEG): container finished" podID="8d314d9f-2d34-4e4c-899e-a113c55ad0df" containerID="782cf2363af9e9d54f13ad0820cdf568595e33a97cd948e6bc8d4ff261ef977e" exitCode=0 Nov 24 09:15:42 crc kubenswrapper[4563]: I1124 09:15:42.833736 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d" event={"ID":"8d314d9f-2d34-4e4c-899e-a113c55ad0df","Type":"ContainerDied","Data":"782cf2363af9e9d54f13ad0820cdf568595e33a97cd948e6bc8d4ff261ef977e"} Nov 24 09:15:44 crc kubenswrapper[4563]: I1124 09:15:44.034806 
4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d" Nov 24 09:15:44 crc kubenswrapper[4563]: I1124 09:15:44.161810 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d314d9f-2d34-4e4c-899e-a113c55ad0df-bundle\") pod \"8d314d9f-2d34-4e4c-899e-a113c55ad0df\" (UID: \"8d314d9f-2d34-4e4c-899e-a113c55ad0df\") " Nov 24 09:15:44 crc kubenswrapper[4563]: I1124 09:15:44.161987 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d314d9f-2d34-4e4c-899e-a113c55ad0df-util\") pod \"8d314d9f-2d34-4e4c-899e-a113c55ad0df\" (UID: \"8d314d9f-2d34-4e4c-899e-a113c55ad0df\") " Nov 24 09:15:44 crc kubenswrapper[4563]: I1124 09:15:44.162080 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trvtb\" (UniqueName: \"kubernetes.io/projected/8d314d9f-2d34-4e4c-899e-a113c55ad0df-kube-api-access-trvtb\") pod \"8d314d9f-2d34-4e4c-899e-a113c55ad0df\" (UID: \"8d314d9f-2d34-4e4c-899e-a113c55ad0df\") " Nov 24 09:15:44 crc kubenswrapper[4563]: I1124 09:15:44.163123 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d314d9f-2d34-4e4c-899e-a113c55ad0df-bundle" (OuterVolumeSpecName: "bundle") pod "8d314d9f-2d34-4e4c-899e-a113c55ad0df" (UID: "8d314d9f-2d34-4e4c-899e-a113c55ad0df"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:15:44 crc kubenswrapper[4563]: I1124 09:15:44.167584 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d314d9f-2d34-4e4c-899e-a113c55ad0df-kube-api-access-trvtb" (OuterVolumeSpecName: "kube-api-access-trvtb") pod "8d314d9f-2d34-4e4c-899e-a113c55ad0df" (UID: "8d314d9f-2d34-4e4c-899e-a113c55ad0df"). 
InnerVolumeSpecName "kube-api-access-trvtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:15:44 crc kubenswrapper[4563]: I1124 09:15:44.171767 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d314d9f-2d34-4e4c-899e-a113c55ad0df-util" (OuterVolumeSpecName: "util") pod "8d314d9f-2d34-4e4c-899e-a113c55ad0df" (UID: "8d314d9f-2d34-4e4c-899e-a113c55ad0df"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:15:44 crc kubenswrapper[4563]: I1124 09:15:44.263177 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trvtb\" (UniqueName: \"kubernetes.io/projected/8d314d9f-2d34-4e4c-899e-a113c55ad0df-kube-api-access-trvtb\") on node \"crc\" DevicePath \"\"" Nov 24 09:15:44 crc kubenswrapper[4563]: I1124 09:15:44.263211 4563 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d314d9f-2d34-4e4c-899e-a113c55ad0df-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:15:44 crc kubenswrapper[4563]: I1124 09:15:44.263221 4563 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d314d9f-2d34-4e4c-899e-a113c55ad0df-util\") on node \"crc\" DevicePath \"\"" Nov 24 09:15:44 crc kubenswrapper[4563]: I1124 09:15:44.849046 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d" event={"ID":"8d314d9f-2d34-4e4c-899e-a113c55ad0df","Type":"ContainerDied","Data":"52fe841e827a1bacd07c8206431c1a5830a73e79a4be2363fd7013531e797c95"} Nov 24 09:15:44 crc kubenswrapper[4563]: I1124 09:15:44.849441 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52fe841e827a1bacd07c8206431c1a5830a73e79a4be2363fd7013531e797c95" Nov 24 09:15:44 crc kubenswrapper[4563]: I1124 09:15:44.849155 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d" Nov 24 09:15:47 crc kubenswrapper[4563]: I1124 09:15:47.287749 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-8486c7f98b-xz8g7"] Nov 24 09:15:47 crc kubenswrapper[4563]: E1124 09:15:47.288212 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d314d9f-2d34-4e4c-899e-a113c55ad0df" containerName="extract" Nov 24 09:15:47 crc kubenswrapper[4563]: I1124 09:15:47.288226 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d314d9f-2d34-4e4c-899e-a113c55ad0df" containerName="extract" Nov 24 09:15:47 crc kubenswrapper[4563]: E1124 09:15:47.288241 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d314d9f-2d34-4e4c-899e-a113c55ad0df" containerName="util" Nov 24 09:15:47 crc kubenswrapper[4563]: I1124 09:15:47.288246 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d314d9f-2d34-4e4c-899e-a113c55ad0df" containerName="util" Nov 24 09:15:47 crc kubenswrapper[4563]: E1124 09:15:47.288252 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d314d9f-2d34-4e4c-899e-a113c55ad0df" containerName="pull" Nov 24 09:15:47 crc kubenswrapper[4563]: I1124 09:15:47.288259 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d314d9f-2d34-4e4c-899e-a113c55ad0df" containerName="pull" Nov 24 09:15:47 crc kubenswrapper[4563]: I1124 09:15:47.288386 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d314d9f-2d34-4e4c-899e-a113c55ad0df" containerName="extract" Nov 24 09:15:47 crc kubenswrapper[4563]: I1124 09:15:47.288917 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-8486c7f98b-xz8g7" Nov 24 09:15:47 crc kubenswrapper[4563]: I1124 09:15:47.291480 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-ktz7p" Nov 24 09:15:47 crc kubenswrapper[4563]: I1124 09:15:47.315804 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-8486c7f98b-xz8g7"] Nov 24 09:15:47 crc kubenswrapper[4563]: I1124 09:15:47.407195 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvdgw\" (UniqueName: \"kubernetes.io/projected/56c65669-5fad-40b3-aec8-b459c3e6b0f8-kube-api-access-rvdgw\") pod \"openstack-operator-controller-operator-8486c7f98b-xz8g7\" (UID: \"56c65669-5fad-40b3-aec8-b459c3e6b0f8\") " pod="openstack-operators/openstack-operator-controller-operator-8486c7f98b-xz8g7" Nov 24 09:15:47 crc kubenswrapper[4563]: I1124 09:15:47.508001 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvdgw\" (UniqueName: \"kubernetes.io/projected/56c65669-5fad-40b3-aec8-b459c3e6b0f8-kube-api-access-rvdgw\") pod \"openstack-operator-controller-operator-8486c7f98b-xz8g7\" (UID: \"56c65669-5fad-40b3-aec8-b459c3e6b0f8\") " pod="openstack-operators/openstack-operator-controller-operator-8486c7f98b-xz8g7" Nov 24 09:15:47 crc kubenswrapper[4563]: I1124 09:15:47.526748 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvdgw\" (UniqueName: \"kubernetes.io/projected/56c65669-5fad-40b3-aec8-b459c3e6b0f8-kube-api-access-rvdgw\") pod \"openstack-operator-controller-operator-8486c7f98b-xz8g7\" (UID: \"56c65669-5fad-40b3-aec8-b459c3e6b0f8\") " pod="openstack-operators/openstack-operator-controller-operator-8486c7f98b-xz8g7" Nov 24 09:15:47 crc kubenswrapper[4563]: I1124 09:15:47.603748 4563 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-8486c7f98b-xz8g7" Nov 24 09:15:48 crc kubenswrapper[4563]: I1124 09:15:48.094056 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-8486c7f98b-xz8g7"] Nov 24 09:15:48 crc kubenswrapper[4563]: I1124 09:15:48.875346 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-8486c7f98b-xz8g7" event={"ID":"56c65669-5fad-40b3-aec8-b459c3e6b0f8","Type":"ContainerStarted","Data":"1803add9484803cc27d98d280fda49c815a1a0b15adc136856b69e4daa525acf"} Nov 24 09:15:51 crc kubenswrapper[4563]: I1124 09:15:51.895414 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-8486c7f98b-xz8g7" event={"ID":"56c65669-5fad-40b3-aec8-b459c3e6b0f8","Type":"ContainerStarted","Data":"25d4d57eb11df869df03a34d01d96f01251cd7221fbb475836c7a62346ee78ea"} Nov 24 09:15:53 crc kubenswrapper[4563]: I1124 09:15:53.911042 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-8486c7f98b-xz8g7" event={"ID":"56c65669-5fad-40b3-aec8-b459c3e6b0f8","Type":"ContainerStarted","Data":"5c2e8d0695c4e84ece641f38f6ecdea647a0f1ec06c3e4f10709fb1cbc2eb25b"} Nov 24 09:15:53 crc kubenswrapper[4563]: I1124 09:15:53.911183 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-8486c7f98b-xz8g7" Nov 24 09:15:53 crc kubenswrapper[4563]: I1124 09:15:53.936290 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-8486c7f98b-xz8g7" podStartSLOduration=1.863529786 podStartE2EDuration="6.936274673s" podCreationTimestamp="2025-11-24 09:15:47 +0000 UTC" firstStartedPulling="2025-11-24 
09:15:48.084153227 +0000 UTC m=+725.343130674" lastFinishedPulling="2025-11-24 09:15:53.156898114 +0000 UTC m=+730.415875561" observedRunningTime="2025-11-24 09:15:53.932322369 +0000 UTC m=+731.191299816" watchObservedRunningTime="2025-11-24 09:15:53.936274673 +0000 UTC m=+731.195252120" Nov 24 09:15:57 crc kubenswrapper[4563]: I1124 09:15:57.607048 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-8486c7f98b-xz8g7" Nov 24 09:16:11 crc kubenswrapper[4563]: I1124 09:16:11.654551 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-g9q5v"] Nov 24 09:16:11 crc kubenswrapper[4563]: I1124 09:16:11.655203 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-g9q5v" podUID="99e63a17-8605-4830-96b4-dd619cf76549" containerName="controller-manager" containerID="cri-o://65d992faf76f9dded47291f2d921cf054d303707053bb4b4aa54580a19aebaac" gracePeriod=30 Nov 24 09:16:11 crc kubenswrapper[4563]: I1124 09:16:11.728099 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd"] Nov 24 09:16:11 crc kubenswrapper[4563]: I1124 09:16:11.728536 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd" podUID="e7c61447-7ef6-408e-8034-46506b36b5d2" containerName="route-controller-manager" containerID="cri-o://29da0e72aa1a43c393446d564de1d3d45c9f96b46951dd175c5753cf31027e67" gracePeriod=30 Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:11.999888 4563 generic.go:334] "Generic (PLEG): container finished" podID="99e63a17-8605-4830-96b4-dd619cf76549" containerID="65d992faf76f9dded47291f2d921cf054d303707053bb4b4aa54580a19aebaac" exitCode=0 Nov 24 09:16:12 crc kubenswrapper[4563]: 
I1124 09:16:11.999954 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-g9q5v" event={"ID":"99e63a17-8605-4830-96b4-dd619cf76549","Type":"ContainerDied","Data":"65d992faf76f9dded47291f2d921cf054d303707053bb4b4aa54580a19aebaac"} Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.001267 4563 generic.go:334] "Generic (PLEG): container finished" podID="e7c61447-7ef6-408e-8034-46506b36b5d2" containerID="29da0e72aa1a43c393446d564de1d3d45c9f96b46951dd175c5753cf31027e67" exitCode=0 Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.001299 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd" event={"ID":"e7c61447-7ef6-408e-8034-46506b36b5d2","Type":"ContainerDied","Data":"29da0e72aa1a43c393446d564de1d3d45c9f96b46951dd175c5753cf31027e67"} Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.065540 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-g9q5v" Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.068791 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd" Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.145246 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tkg5\" (UniqueName: \"kubernetes.io/projected/99e63a17-8605-4830-96b4-dd619cf76549-kube-api-access-4tkg5\") pod \"99e63a17-8605-4830-96b4-dd619cf76549\" (UID: \"99e63a17-8605-4830-96b4-dd619cf76549\") " Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.145291 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99e63a17-8605-4830-96b4-dd619cf76549-config\") pod \"99e63a17-8605-4830-96b4-dd619cf76549\" (UID: \"99e63a17-8605-4830-96b4-dd619cf76549\") " Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.145343 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c61447-7ef6-408e-8034-46506b36b5d2-config\") pod \"e7c61447-7ef6-408e-8034-46506b36b5d2\" (UID: \"e7c61447-7ef6-408e-8034-46506b36b5d2\") " Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.145370 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99e63a17-8605-4830-96b4-dd619cf76549-proxy-ca-bundles\") pod \"99e63a17-8605-4830-96b4-dd619cf76549\" (UID: \"99e63a17-8605-4830-96b4-dd619cf76549\") " Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.145396 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7c61447-7ef6-408e-8034-46506b36b5d2-serving-cert\") pod \"e7c61447-7ef6-408e-8034-46506b36b5d2\" (UID: \"e7c61447-7ef6-408e-8034-46506b36b5d2\") " Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.145428 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99e63a17-8605-4830-96b4-dd619cf76549-client-ca\") pod \"99e63a17-8605-4830-96b4-dd619cf76549\" (UID: \"99e63a17-8605-4830-96b4-dd619cf76549\") " Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.145442 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99e63a17-8605-4830-96b4-dd619cf76549-serving-cert\") pod \"99e63a17-8605-4830-96b4-dd619cf76549\" (UID: \"99e63a17-8605-4830-96b4-dd619cf76549\") " Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.145494 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw2j4\" (UniqueName: \"kubernetes.io/projected/e7c61447-7ef6-408e-8034-46506b36b5d2-kube-api-access-tw2j4\") pod \"e7c61447-7ef6-408e-8034-46506b36b5d2\" (UID: \"e7c61447-7ef6-408e-8034-46506b36b5d2\") " Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.145522 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7c61447-7ef6-408e-8034-46506b36b5d2-client-ca\") pod \"e7c61447-7ef6-408e-8034-46506b36b5d2\" (UID: \"e7c61447-7ef6-408e-8034-46506b36b5d2\") " Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.146478 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7c61447-7ef6-408e-8034-46506b36b5d2-config" (OuterVolumeSpecName: "config") pod "e7c61447-7ef6-408e-8034-46506b36b5d2" (UID: "e7c61447-7ef6-408e-8034-46506b36b5d2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.146916 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99e63a17-8605-4830-96b4-dd619cf76549-config" (OuterVolumeSpecName: "config") pod "99e63a17-8605-4830-96b4-dd619cf76549" (UID: "99e63a17-8605-4830-96b4-dd619cf76549"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.147430 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7c61447-7ef6-408e-8034-46506b36b5d2-client-ca" (OuterVolumeSpecName: "client-ca") pod "e7c61447-7ef6-408e-8034-46506b36b5d2" (UID: "e7c61447-7ef6-408e-8034-46506b36b5d2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.147706 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99e63a17-8605-4830-96b4-dd619cf76549-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "99e63a17-8605-4830-96b4-dd619cf76549" (UID: "99e63a17-8605-4830-96b4-dd619cf76549"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.150102 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99e63a17-8605-4830-96b4-dd619cf76549-client-ca" (OuterVolumeSpecName: "client-ca") pod "99e63a17-8605-4830-96b4-dd619cf76549" (UID: "99e63a17-8605-4830-96b4-dd619cf76549"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.154727 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99e63a17-8605-4830-96b4-dd619cf76549-kube-api-access-4tkg5" (OuterVolumeSpecName: "kube-api-access-4tkg5") pod "99e63a17-8605-4830-96b4-dd619cf76549" (UID: "99e63a17-8605-4830-96b4-dd619cf76549"). InnerVolumeSpecName "kube-api-access-4tkg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.158088 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7c61447-7ef6-408e-8034-46506b36b5d2-kube-api-access-tw2j4" (OuterVolumeSpecName: "kube-api-access-tw2j4") pod "e7c61447-7ef6-408e-8034-46506b36b5d2" (UID: "e7c61447-7ef6-408e-8034-46506b36b5d2"). InnerVolumeSpecName "kube-api-access-tw2j4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.158478 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99e63a17-8605-4830-96b4-dd619cf76549-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "99e63a17-8605-4830-96b4-dd619cf76549" (UID: "99e63a17-8605-4830-96b4-dd619cf76549"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.160847 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7c61447-7ef6-408e-8034-46506b36b5d2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7c61447-7ef6-408e-8034-46506b36b5d2" (UID: "e7c61447-7ef6-408e-8034-46506b36b5d2"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.247570 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw2j4\" (UniqueName: \"kubernetes.io/projected/e7c61447-7ef6-408e-8034-46506b36b5d2-kube-api-access-tw2j4\") on node \"crc\" DevicePath \"\"" Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.247602 4563 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7c61447-7ef6-408e-8034-46506b36b5d2-client-ca\") on node \"crc\" DevicePath \"\"" Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.247614 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99e63a17-8605-4830-96b4-dd619cf76549-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.247624 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tkg5\" (UniqueName: \"kubernetes.io/projected/99e63a17-8605-4830-96b4-dd619cf76549-kube-api-access-4tkg5\") on node \"crc\" DevicePath \"\"" Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.247631 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c61447-7ef6-408e-8034-46506b36b5d2-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.247653 4563 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99e63a17-8605-4830-96b4-dd619cf76549-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.247662 4563 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7c61447-7ef6-408e-8034-46506b36b5d2-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.247670 4563 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99e63a17-8605-4830-96b4-dd619cf76549-client-ca\") on node \"crc\" DevicePath \"\"" Nov 24 09:16:12 crc kubenswrapper[4563]: I1124 09:16:12.247677 4563 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99e63a17-8605-4830-96b4-dd619cf76549-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.007077 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.007109 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd" event={"ID":"e7c61447-7ef6-408e-8034-46506b36b5d2","Type":"ContainerDied","Data":"1cb9ea1cb353326b34a8c8b43a54788bc63dfeba339982fc64587a588d8fa8d0"} Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.007162 4563 scope.go:117] "RemoveContainer" containerID="29da0e72aa1a43c393446d564de1d3d45c9f96b46951dd175c5753cf31027e67" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.010754 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-g9q5v" event={"ID":"99e63a17-8605-4830-96b4-dd619cf76549","Type":"ContainerDied","Data":"642ca4d153567074abf47780a4f5a2029c2adaabb3592b271b1b3b0c3e9e6225"} Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.010819 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-g9q5v" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.029349 4563 scope.go:117] "RemoveContainer" containerID="65d992faf76f9dded47291f2d921cf054d303707053bb4b4aa54580a19aebaac" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.030324 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd"] Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.033541 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rtrwd"] Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.039563 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-g9q5v"] Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.042487 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-g9q5v"] Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.061966 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99e63a17-8605-4830-96b4-dd619cf76549" path="/var/lib/kubelet/pods/99e63a17-8605-4830-96b4-dd619cf76549/volumes" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.062551 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7c61447-7ef6-408e-8034-46506b36b5d2" path="/var/lib/kubelet/pods/e7c61447-7ef6-408e-8034-46506b36b5d2/volumes" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.271298 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57f4b94764-cp6cn"] Nov 24 09:16:13 crc kubenswrapper[4563]: E1124 09:16:13.271956 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c61447-7ef6-408e-8034-46506b36b5d2" containerName="route-controller-manager" Nov 24 09:16:13 crc 
kubenswrapper[4563]: I1124 09:16:13.271976 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c61447-7ef6-408e-8034-46506b36b5d2" containerName="route-controller-manager" Nov 24 09:16:13 crc kubenswrapper[4563]: E1124 09:16:13.271988 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99e63a17-8605-4830-96b4-dd619cf76549" containerName="controller-manager" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.271996 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="99e63a17-8605-4830-96b4-dd619cf76549" containerName="controller-manager" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.272171 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="99e63a17-8605-4830-96b4-dd619cf76549" containerName="controller-manager" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.272204 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7c61447-7ef6-408e-8034-46506b36b5d2" containerName="route-controller-manager" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.272840 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57f4b94764-cp6cn" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.274366 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-66d44fd8b6-2vgq6"] Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.274685 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.274795 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.274824 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.275161 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.275172 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66d44fd8b6-2vgq6" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.275290 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.277374 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.277869 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.277925 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.277930 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.278211 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.278387 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.278483 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.283216 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.294698 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57f4b94764-cp6cn"] Nov 24 09:16:13 crc 
kubenswrapper[4563]: I1124 09:16:13.301190 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66d44fd8b6-2vgq6"] Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.360971 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2ebd330-0f96-4619-992b-d73bddd5ca58-client-ca\") pod \"controller-manager-66d44fd8b6-2vgq6\" (UID: \"d2ebd330-0f96-4619-992b-d73bddd5ca58\") " pod="openshift-controller-manager/controller-manager-66d44fd8b6-2vgq6" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.361021 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz68l\" (UniqueName: \"kubernetes.io/projected/b5284336-a335-4d2f-a960-d133a6b32dc6-kube-api-access-mz68l\") pod \"route-controller-manager-57f4b94764-cp6cn\" (UID: \"b5284336-a335-4d2f-a960-d133a6b32dc6\") " pod="openshift-route-controller-manager/route-controller-manager-57f4b94764-cp6cn" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.361160 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2ebd330-0f96-4619-992b-d73bddd5ca58-serving-cert\") pod \"controller-manager-66d44fd8b6-2vgq6\" (UID: \"d2ebd330-0f96-4619-992b-d73bddd5ca58\") " pod="openshift-controller-manager/controller-manager-66d44fd8b6-2vgq6" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.361204 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5284336-a335-4d2f-a960-d133a6b32dc6-serving-cert\") pod \"route-controller-manager-57f4b94764-cp6cn\" (UID: \"b5284336-a335-4d2f-a960-d133a6b32dc6\") " pod="openshift-route-controller-manager/route-controller-manager-57f4b94764-cp6cn" Nov 24 09:16:13 crc 
kubenswrapper[4563]: I1124 09:16:13.361277 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2ebd330-0f96-4619-992b-d73bddd5ca58-proxy-ca-bundles\") pod \"controller-manager-66d44fd8b6-2vgq6\" (UID: \"d2ebd330-0f96-4619-992b-d73bddd5ca58\") " pod="openshift-controller-manager/controller-manager-66d44fd8b6-2vgq6" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.361361 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2ebd330-0f96-4619-992b-d73bddd5ca58-config\") pod \"controller-manager-66d44fd8b6-2vgq6\" (UID: \"d2ebd330-0f96-4619-992b-d73bddd5ca58\") " pod="openshift-controller-manager/controller-manager-66d44fd8b6-2vgq6" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.361401 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5284336-a335-4d2f-a960-d133a6b32dc6-client-ca\") pod \"route-controller-manager-57f4b94764-cp6cn\" (UID: \"b5284336-a335-4d2f-a960-d133a6b32dc6\") " pod="openshift-route-controller-manager/route-controller-manager-57f4b94764-cp6cn" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.361427 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5284336-a335-4d2f-a960-d133a6b32dc6-config\") pod \"route-controller-manager-57f4b94764-cp6cn\" (UID: \"b5284336-a335-4d2f-a960-d133a6b32dc6\") " pod="openshift-route-controller-manager/route-controller-manager-57f4b94764-cp6cn" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.361454 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v77nk\" (UniqueName: 
\"kubernetes.io/projected/d2ebd330-0f96-4619-992b-d73bddd5ca58-kube-api-access-v77nk\") pod \"controller-manager-66d44fd8b6-2vgq6\" (UID: \"d2ebd330-0f96-4619-992b-d73bddd5ca58\") " pod="openshift-controller-manager/controller-manager-66d44fd8b6-2vgq6" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.413508 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66d44fd8b6-2vgq6"] Nov 24 09:16:13 crc kubenswrapper[4563]: E1124 09:16:13.414455 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-v77nk proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-66d44fd8b6-2vgq6" podUID="d2ebd330-0f96-4619-992b-d73bddd5ca58" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.421019 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57f4b94764-cp6cn"] Nov 24 09:16:13 crc kubenswrapper[4563]: E1124 09:16:13.421603 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-mz68l serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-57f4b94764-cp6cn" podUID="b5284336-a335-4d2f-a960-d133a6b32dc6" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.463121 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2ebd330-0f96-4619-992b-d73bddd5ca58-serving-cert\") pod \"controller-manager-66d44fd8b6-2vgq6\" (UID: \"d2ebd330-0f96-4619-992b-d73bddd5ca58\") " pod="openshift-controller-manager/controller-manager-66d44fd8b6-2vgq6" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.463348 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5284336-a335-4d2f-a960-d133a6b32dc6-serving-cert\") pod \"route-controller-manager-57f4b94764-cp6cn\" (UID: \"b5284336-a335-4d2f-a960-d133a6b32dc6\") " pod="openshift-route-controller-manager/route-controller-manager-57f4b94764-cp6cn" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.463478 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2ebd330-0f96-4619-992b-d73bddd5ca58-proxy-ca-bundles\") pod \"controller-manager-66d44fd8b6-2vgq6\" (UID: \"d2ebd330-0f96-4619-992b-d73bddd5ca58\") " pod="openshift-controller-manager/controller-manager-66d44fd8b6-2vgq6" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.463610 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2ebd330-0f96-4619-992b-d73bddd5ca58-config\") pod \"controller-manager-66d44fd8b6-2vgq6\" (UID: \"d2ebd330-0f96-4619-992b-d73bddd5ca58\") " pod="openshift-controller-manager/controller-manager-66d44fd8b6-2vgq6" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.463721 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5284336-a335-4d2f-a960-d133a6b32dc6-client-ca\") pod \"route-controller-manager-57f4b94764-cp6cn\" (UID: \"b5284336-a335-4d2f-a960-d133a6b32dc6\") " pod="openshift-route-controller-manager/route-controller-manager-57f4b94764-cp6cn" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.463797 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5284336-a335-4d2f-a960-d133a6b32dc6-config\") pod \"route-controller-manager-57f4b94764-cp6cn\" (UID: \"b5284336-a335-4d2f-a960-d133a6b32dc6\") " pod="openshift-route-controller-manager/route-controller-manager-57f4b94764-cp6cn" Nov 24 09:16:13 
crc kubenswrapper[4563]: I1124 09:16:13.463881 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v77nk\" (UniqueName: \"kubernetes.io/projected/d2ebd330-0f96-4619-992b-d73bddd5ca58-kube-api-access-v77nk\") pod \"controller-manager-66d44fd8b6-2vgq6\" (UID: \"d2ebd330-0f96-4619-992b-d73bddd5ca58\") " pod="openshift-controller-manager/controller-manager-66d44fd8b6-2vgq6" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.464023 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2ebd330-0f96-4619-992b-d73bddd5ca58-client-ca\") pod \"controller-manager-66d44fd8b6-2vgq6\" (UID: \"d2ebd330-0f96-4619-992b-d73bddd5ca58\") " pod="openshift-controller-manager/controller-manager-66d44fd8b6-2vgq6" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.465013 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz68l\" (UniqueName: \"kubernetes.io/projected/b5284336-a335-4d2f-a960-d133a6b32dc6-kube-api-access-mz68l\") pod \"route-controller-manager-57f4b94764-cp6cn\" (UID: \"b5284336-a335-4d2f-a960-d133a6b32dc6\") " pod="openshift-route-controller-manager/route-controller-manager-57f4b94764-cp6cn" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.464557 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2ebd330-0f96-4619-992b-d73bddd5ca58-proxy-ca-bundles\") pod \"controller-manager-66d44fd8b6-2vgq6\" (UID: \"d2ebd330-0f96-4619-992b-d73bddd5ca58\") " pod="openshift-controller-manager/controller-manager-66d44fd8b6-2vgq6" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.464830 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2ebd330-0f96-4619-992b-d73bddd5ca58-config\") pod \"controller-manager-66d44fd8b6-2vgq6\" (UID: 
\"d2ebd330-0f96-4619-992b-d73bddd5ca58\") " pod="openshift-controller-manager/controller-manager-66d44fd8b6-2vgq6" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.464954 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2ebd330-0f96-4619-992b-d73bddd5ca58-client-ca\") pod \"controller-manager-66d44fd8b6-2vgq6\" (UID: \"d2ebd330-0f96-4619-992b-d73bddd5ca58\") " pod="openshift-controller-manager/controller-manager-66d44fd8b6-2vgq6" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.464961 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5284336-a335-4d2f-a960-d133a6b32dc6-config\") pod \"route-controller-manager-57f4b94764-cp6cn\" (UID: \"b5284336-a335-4d2f-a960-d133a6b32dc6\") " pod="openshift-route-controller-manager/route-controller-manager-57f4b94764-cp6cn" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.464793 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5284336-a335-4d2f-a960-d133a6b32dc6-client-ca\") pod \"route-controller-manager-57f4b94764-cp6cn\" (UID: \"b5284336-a335-4d2f-a960-d133a6b32dc6\") " pod="openshift-route-controller-manager/route-controller-manager-57f4b94764-cp6cn" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.468521 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2ebd330-0f96-4619-992b-d73bddd5ca58-serving-cert\") pod \"controller-manager-66d44fd8b6-2vgq6\" (UID: \"d2ebd330-0f96-4619-992b-d73bddd5ca58\") " pod="openshift-controller-manager/controller-manager-66d44fd8b6-2vgq6" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.470224 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b5284336-a335-4d2f-a960-d133a6b32dc6-serving-cert\") pod \"route-controller-manager-57f4b94764-cp6cn\" (UID: \"b5284336-a335-4d2f-a960-d133a6b32dc6\") " pod="openshift-route-controller-manager/route-controller-manager-57f4b94764-cp6cn" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.477531 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v77nk\" (UniqueName: \"kubernetes.io/projected/d2ebd330-0f96-4619-992b-d73bddd5ca58-kube-api-access-v77nk\") pod \"controller-manager-66d44fd8b6-2vgq6\" (UID: \"d2ebd330-0f96-4619-992b-d73bddd5ca58\") " pod="openshift-controller-manager/controller-manager-66d44fd8b6-2vgq6" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.477929 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz68l\" (UniqueName: \"kubernetes.io/projected/b5284336-a335-4d2f-a960-d133a6b32dc6-kube-api-access-mz68l\") pod \"route-controller-manager-57f4b94764-cp6cn\" (UID: \"b5284336-a335-4d2f-a960-d133a6b32dc6\") " pod="openshift-route-controller-manager/route-controller-manager-57f4b94764-cp6cn" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.862798 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7768f8c84f-glf4s"] Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.863714 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7768f8c84f-glf4s" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.865987 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-9lcgw" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.873709 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7768f8c84f-glf4s"] Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.897544 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6d8fd67bf7-jnx9f"] Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.900900 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6d8fd67bf7-jnx9f" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.903394 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-mn69r" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.911084 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6d8fd67bf7-jnx9f"] Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.948961 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-56dfb6b67f-77wgb"] Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.950186 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-56dfb6b67f-77wgb" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.951938 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-65qbh" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.954074 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-56dfb6b67f-77wgb"] Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.960983 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-8667fbf6f6-k9wzp"] Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.962245 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8667fbf6f6-k9wzp" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.962999 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8667fbf6f6-k9wzp"] Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.964053 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-pdttk" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.973795 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr6qc\" (UniqueName: \"kubernetes.io/projected/f81c148e-bf8e-4b57-895e-f2c11411cf7a-kube-api-access-vr6qc\") pod \"barbican-operator-controller-manager-7768f8c84f-glf4s\" (UID: \"f81c148e-bf8e-4b57-895e-f2c11411cf7a\") " pod="openstack-operators/barbican-operator-controller-manager-7768f8c84f-glf4s" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.973825 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twppz\" 
(UniqueName: \"kubernetes.io/projected/a62a6523-e592-437f-b3ba-320e24f619dc-kube-api-access-twppz\") pod \"cinder-operator-controller-manager-6d8fd67bf7-jnx9f\" (UID: \"a62a6523-e592-437f-b3ba-320e24f619dc\") " pod="openstack-operators/cinder-operator-controller-manager-6d8fd67bf7-jnx9f" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.974205 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-bf4c6585d-tnxst"] Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.975215 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-bf4c6585d-tnxst" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.977191 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-pbw2v" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.981215 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-bf4c6585d-tnxst"] Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.994731 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5d86b44686-4x76m"] Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.995867 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5d86b44686-4x76m" Nov 24 09:16:13 crc kubenswrapper[4563]: I1124 09:16:13.998936 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-xfbjj" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.006808 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5d86b44686-4x76m"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.010835 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-769d9c7585-4f5hq"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.011806 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-769d9c7585-4f5hq" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.020787 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57f4b94764-cp6cn" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.021349 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66d44fd8b6-2vgq6" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.024786 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-s77dd" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.024952 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.025847 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-769d9c7585-4f5hq"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.037082 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5c75d7c94b-ltqbl"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.037613 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57f4b94764-cp6cn" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.037932 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5c75d7c94b-ltqbl" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.042310 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-l7dpk" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.046929 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66d44fd8b6-2vgq6" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.075069 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5284336-a335-4d2f-a960-d133a6b32dc6-client-ca\") pod \"b5284336-a335-4d2f-a960-d133a6b32dc6\" (UID: \"b5284336-a335-4d2f-a960-d133a6b32dc6\") " Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.075127 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5284336-a335-4d2f-a960-d133a6b32dc6-serving-cert\") pod \"b5284336-a335-4d2f-a960-d133a6b32dc6\" (UID: \"b5284336-a335-4d2f-a960-d133a6b32dc6\") " Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.075163 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz68l\" (UniqueName: \"kubernetes.io/projected/b5284336-a335-4d2f-a960-d133a6b32dc6-kube-api-access-mz68l\") pod \"b5284336-a335-4d2f-a960-d133a6b32dc6\" (UID: \"b5284336-a335-4d2f-a960-d133a6b32dc6\") " Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.075182 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5284336-a335-4d2f-a960-d133a6b32dc6-config\") pod \"b5284336-a335-4d2f-a960-d133a6b32dc6\" (UID: \"b5284336-a335-4d2f-a960-d133a6b32dc6\") " Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.075465 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c8mm\" (UniqueName: \"kubernetes.io/projected/68eeb4a0-b192-4e6a-b02b-f34415b29316-kube-api-access-4c8mm\") pod \"infra-operator-controller-manager-769d9c7585-4f5hq\" (UID: \"68eeb4a0-b192-4e6a-b02b-f34415b29316\") " pod="openstack-operators/infra-operator-controller-manager-769d9c7585-4f5hq" Nov 24 09:16:14 crc 
kubenswrapper[4563]: I1124 09:16:14.075516 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjq9r\" (UniqueName: \"kubernetes.io/projected/17904228-d0e5-489c-a965-5cba44f3b3f2-kube-api-access-cjq9r\") pod \"designate-operator-controller-manager-56dfb6b67f-77wgb\" (UID: \"17904228-d0e5-489c-a965-5cba44f3b3f2\") " pod="openstack-operators/designate-operator-controller-manager-56dfb6b67f-77wgb" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.075567 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68eeb4a0-b192-4e6a-b02b-f34415b29316-cert\") pod \"infra-operator-controller-manager-769d9c7585-4f5hq\" (UID: \"68eeb4a0-b192-4e6a-b02b-f34415b29316\") " pod="openstack-operators/infra-operator-controller-manager-769d9c7585-4f5hq" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.075626 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr6qc\" (UniqueName: \"kubernetes.io/projected/f81c148e-bf8e-4b57-895e-f2c11411cf7a-kube-api-access-vr6qc\") pod \"barbican-operator-controller-manager-7768f8c84f-glf4s\" (UID: \"f81c148e-bf8e-4b57-895e-f2c11411cf7a\") " pod="openstack-operators/barbican-operator-controller-manager-7768f8c84f-glf4s" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.076104 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mc9l\" (UniqueName: \"kubernetes.io/projected/70a63634-9a9f-46b3-af05-9dc02c0a03e1-kube-api-access-8mc9l\") pod \"horizon-operator-controller-manager-5d86b44686-4x76m\" (UID: \"70a63634-9a9f-46b3-af05-9dc02c0a03e1\") " pod="openstack-operators/horizon-operator-controller-manager-5d86b44686-4x76m" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.076148 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-gv8fk\" (UniqueName: \"kubernetes.io/projected/26aa13a3-737a-457f-9d46-29018cfccd1e-kube-api-access-gv8fk\") pod \"ironic-operator-controller-manager-5c75d7c94b-ltqbl\" (UID: \"26aa13a3-737a-457f-9d46-29018cfccd1e\") " pod="openstack-operators/ironic-operator-controller-manager-5c75d7c94b-ltqbl" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.076180 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twppz\" (UniqueName: \"kubernetes.io/projected/a62a6523-e592-437f-b3ba-320e24f619dc-kube-api-access-twppz\") pod \"cinder-operator-controller-manager-6d8fd67bf7-jnx9f\" (UID: \"a62a6523-e592-437f-b3ba-320e24f619dc\") " pod="openstack-operators/cinder-operator-controller-manager-6d8fd67bf7-jnx9f" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.076237 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2fgt\" (UniqueName: \"kubernetes.io/projected/b4f4311c-5634-4bae-8659-5efa662f0562-kube-api-access-v2fgt\") pod \"heat-operator-controller-manager-bf4c6585d-tnxst\" (UID: \"b4f4311c-5634-4bae-8659-5efa662f0562\") " pod="openstack-operators/heat-operator-controller-manager-bf4c6585d-tnxst" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.076262 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vpwc\" (UniqueName: \"kubernetes.io/projected/77d539d7-5235-4576-a276-8247c5824020-kube-api-access-6vpwc\") pod \"glance-operator-controller-manager-8667fbf6f6-k9wzp\" (UID: \"77d539d7-5235-4576-a276-8247c5824020\") " pod="openstack-operators/glance-operator-controller-manager-8667fbf6f6-k9wzp" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.077787 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7bb88cb858-44jfn"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.078418 4563 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5284336-a335-4d2f-a960-d133a6b32dc6-client-ca" (OuterVolumeSpecName: "client-ca") pod "b5284336-a335-4d2f-a960-d133a6b32dc6" (UID: "b5284336-a335-4d2f-a960-d133a6b32dc6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.078584 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5284336-a335-4d2f-a960-d133a6b32dc6-config" (OuterVolumeSpecName: "config") pod "b5284336-a335-4d2f-a960-d133a6b32dc6" (UID: "b5284336-a335-4d2f-a960-d133a6b32dc6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.078843 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7bb88cb858-44jfn" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.081413 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5284336-a335-4d2f-a960-d133a6b32dc6-kube-api-access-mz68l" (OuterVolumeSpecName: "kube-api-access-mz68l") pod "b5284336-a335-4d2f-a960-d133a6b32dc6" (UID: "b5284336-a335-4d2f-a960-d133a6b32dc6"). InnerVolumeSpecName "kube-api-access-mz68l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.081784 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5284336-a335-4d2f-a960-d133a6b32dc6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b5284336-a335-4d2f-a960-d133a6b32dc6" (UID: "b5284336-a335-4d2f-a960-d133a6b32dc6"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.085542 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7879fb76fd-4tv9l"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.086472 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7879fb76fd-4tv9l" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.089236 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5c75d7c94b-ltqbl"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.093387 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7bb88cb858-44jfn"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.096236 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7879fb76fd-4tv9l"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.106167 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-fbwdn" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.106326 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-x9cj5" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.110890 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6f8c5b86cb-94tjk"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.111942 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6f8c5b86cb-94tjk"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.112049 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6f8c5b86cb-94tjk" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.116685 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-66b7d6f598-fffcm"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.117148 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr6qc\" (UniqueName: \"kubernetes.io/projected/f81c148e-bf8e-4b57-895e-f2c11411cf7a-kube-api-access-vr6qc\") pod \"barbican-operator-controller-manager-7768f8c84f-glf4s\" (UID: \"f81c148e-bf8e-4b57-895e-f2c11411cf7a\") " pod="openstack-operators/barbican-operator-controller-manager-7768f8c84f-glf4s" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.117855 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-66b7d6f598-fffcm" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.121520 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-9k484" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.121739 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-66b7d6f598-fffcm"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.122978 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-86d796d84d-vkltr"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.123874 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-86d796d84d-vkltr" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.144282 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-86d796d84d-vkltr"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.147207 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twppz\" (UniqueName: \"kubernetes.io/projected/a62a6523-e592-437f-b3ba-320e24f619dc-kube-api-access-twppz\") pod \"cinder-operator-controller-manager-6d8fd67bf7-jnx9f\" (UID: \"a62a6523-e592-437f-b3ba-320e24f619dc\") " pod="openstack-operators/cinder-operator-controller-manager-6d8fd67bf7-jnx9f" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.177041 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2ebd330-0f96-4619-992b-d73bddd5ca58-client-ca\") pod \"d2ebd330-0f96-4619-992b-d73bddd5ca58\" (UID: \"d2ebd330-0f96-4619-992b-d73bddd5ca58\") " Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.177101 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v77nk\" (UniqueName: \"kubernetes.io/projected/d2ebd330-0f96-4619-992b-d73bddd5ca58-kube-api-access-v77nk\") pod \"d2ebd330-0f96-4619-992b-d73bddd5ca58\" (UID: \"d2ebd330-0f96-4619-992b-d73bddd5ca58\") " Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.177139 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2ebd330-0f96-4619-992b-d73bddd5ca58-proxy-ca-bundles\") pod \"d2ebd330-0f96-4619-992b-d73bddd5ca58\" (UID: \"d2ebd330-0f96-4619-992b-d73bddd5ca58\") " Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.177169 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d2ebd330-0f96-4619-992b-d73bddd5ca58-config\") pod \"d2ebd330-0f96-4619-992b-d73bddd5ca58\" (UID: \"d2ebd330-0f96-4619-992b-d73bddd5ca58\") " Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.177216 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2ebd330-0f96-4619-992b-d73bddd5ca58-serving-cert\") pod \"d2ebd330-0f96-4619-992b-d73bddd5ca58\" (UID: \"d2ebd330-0f96-4619-992b-d73bddd5ca58\") " Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.177412 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xjjz\" (UniqueName: \"kubernetes.io/projected/c089c738-65b8-46e2-91c9-59b962081c05-kube-api-access-9xjjz\") pod \"nova-operator-controller-manager-86d796d84d-vkltr\" (UID: \"c089c738-65b8-46e2-91c9-59b962081c05\") " pod="openstack-operators/nova-operator-controller-manager-86d796d84d-vkltr" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.177493 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68eeb4a0-b192-4e6a-b02b-f34415b29316-cert\") pod \"infra-operator-controller-manager-769d9c7585-4f5hq\" (UID: \"68eeb4a0-b192-4e6a-b02b-f34415b29316\") " pod="openstack-operators/infra-operator-controller-manager-769d9c7585-4f5hq" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.177549 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62pfg\" (UniqueName: \"kubernetes.io/projected/a30aea9a-f4c8-42a3-89bb-af9ffef55544-kube-api-access-62pfg\") pod \"mariadb-operator-controller-manager-6f8c5b86cb-94tjk\" (UID: \"a30aea9a-f4c8-42a3-89bb-af9ffef55544\") " pod="openstack-operators/mariadb-operator-controller-manager-6f8c5b86cb-94tjk" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.177592 4563 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8mc9l\" (UniqueName: \"kubernetes.io/projected/70a63634-9a9f-46b3-af05-9dc02c0a03e1-kube-api-access-8mc9l\") pod \"horizon-operator-controller-manager-5d86b44686-4x76m\" (UID: \"70a63634-9a9f-46b3-af05-9dc02c0a03e1\") " pod="openstack-operators/horizon-operator-controller-manager-5d86b44686-4x76m" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.177612 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv8fk\" (UniqueName: \"kubernetes.io/projected/26aa13a3-737a-457f-9d46-29018cfccd1e-kube-api-access-gv8fk\") pod \"ironic-operator-controller-manager-5c75d7c94b-ltqbl\" (UID: \"26aa13a3-737a-457f-9d46-29018cfccd1e\") " pod="openstack-operators/ironic-operator-controller-manager-5c75d7c94b-ltqbl" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.177660 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlwls\" (UniqueName: \"kubernetes.io/projected/13a3e7f4-4c3d-46e7-a7ee-612f9ac17bc7-kube-api-access-jlwls\") pod \"manila-operator-controller-manager-7bb88cb858-44jfn\" (UID: \"13a3e7f4-4c3d-46e7-a7ee-612f9ac17bc7\") " pod="openstack-operators/manila-operator-controller-manager-7bb88cb858-44jfn" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.177706 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd4dl\" (UniqueName: \"kubernetes.io/projected/ebed0d67-0bac-4d1f-a2d0-2e367d78d157-kube-api-access-kd4dl\") pod \"keystone-operator-controller-manager-7879fb76fd-4tv9l\" (UID: \"ebed0d67-0bac-4d1f-a2d0-2e367d78d157\") " pod="openstack-operators/keystone-operator-controller-manager-7879fb76fd-4tv9l" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.177728 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8k6q\" (UniqueName: 
\"kubernetes.io/projected/ffcb9e74-1697-402a-b77b-5a3ecc832759-kube-api-access-m8k6q\") pod \"neutron-operator-controller-manager-66b7d6f598-fffcm\" (UID: \"ffcb9e74-1697-402a-b77b-5a3ecc832759\") " pod="openstack-operators/neutron-operator-controller-manager-66b7d6f598-fffcm" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.177746 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2fgt\" (UniqueName: \"kubernetes.io/projected/b4f4311c-5634-4bae-8659-5efa662f0562-kube-api-access-v2fgt\") pod \"heat-operator-controller-manager-bf4c6585d-tnxst\" (UID: \"b4f4311c-5634-4bae-8659-5efa662f0562\") " pod="openstack-operators/heat-operator-controller-manager-bf4c6585d-tnxst" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.177757 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2ebd330-0f96-4619-992b-d73bddd5ca58-client-ca" (OuterVolumeSpecName: "client-ca") pod "d2ebd330-0f96-4619-992b-d73bddd5ca58" (UID: "d2ebd330-0f96-4619-992b-d73bddd5ca58"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.177771 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vpwc\" (UniqueName: \"kubernetes.io/projected/77d539d7-5235-4576-a276-8247c5824020-kube-api-access-6vpwc\") pod \"glance-operator-controller-manager-8667fbf6f6-k9wzp\" (UID: \"77d539d7-5235-4576-a276-8247c5824020\") " pod="openstack-operators/glance-operator-controller-manager-8667fbf6f6-k9wzp" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.177851 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c8mm\" (UniqueName: \"kubernetes.io/projected/68eeb4a0-b192-4e6a-b02b-f34415b29316-kube-api-access-4c8mm\") pod \"infra-operator-controller-manager-769d9c7585-4f5hq\" (UID: \"68eeb4a0-b192-4e6a-b02b-f34415b29316\") " pod="openstack-operators/infra-operator-controller-manager-769d9c7585-4f5hq" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.177911 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjq9r\" (UniqueName: \"kubernetes.io/projected/17904228-d0e5-489c-a965-5cba44f3b3f2-kube-api-access-cjq9r\") pod \"designate-operator-controller-manager-56dfb6b67f-77wgb\" (UID: \"17904228-d0e5-489c-a965-5cba44f3b3f2\") " pod="openstack-operators/designate-operator-controller-manager-56dfb6b67f-77wgb" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.177990 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mz68l\" (UniqueName: \"kubernetes.io/projected/b5284336-a335-4d2f-a960-d133a6b32dc6-kube-api-access-mz68l\") on node \"crc\" DevicePath \"\"" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.178003 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5284336-a335-4d2f-a960-d133a6b32dc6-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:16:14 crc 
kubenswrapper[4563]: I1124 09:16:14.178012 4563 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2ebd330-0f96-4619-992b-d73bddd5ca58-client-ca\") on node \"crc\" DevicePath \"\"" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.178023 4563 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5284336-a335-4d2f-a960-d133a6b32dc6-client-ca\") on node \"crc\" DevicePath \"\"" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.178032 4563 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5284336-a335-4d2f-a960-d133a6b32dc6-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.178500 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2ebd330-0f96-4619-992b-d73bddd5ca58-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d2ebd330-0f96-4619-992b-d73bddd5ca58" (UID: "d2ebd330-0f96-4619-992b-d73bddd5ca58"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.178958 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2ebd330-0f96-4619-992b-d73bddd5ca58-config" (OuterVolumeSpecName: "config") pod "d2ebd330-0f96-4619-992b-d73bddd5ca58" (UID: "d2ebd330-0f96-4619-992b-d73bddd5ca58"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.184324 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7768f8c84f-glf4s" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.196101 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68eeb4a0-b192-4e6a-b02b-f34415b29316-cert\") pod \"infra-operator-controller-manager-769d9c7585-4f5hq\" (UID: \"68eeb4a0-b192-4e6a-b02b-f34415b29316\") " pod="openstack-operators/infra-operator-controller-manager-769d9c7585-4f5hq" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.196167 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6fdc856c5d-h78s9"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.196418 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2ebd330-0f96-4619-992b-d73bddd5ca58-kube-api-access-v77nk" (OuterVolumeSpecName: "kube-api-access-v77nk") pod "d2ebd330-0f96-4619-992b-d73bddd5ca58" (UID: "d2ebd330-0f96-4619-992b-d73bddd5ca58"). InnerVolumeSpecName "kube-api-access-v77nk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.197348 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6fdc856c5d-h78s9" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.199813 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2ebd330-0f96-4619-992b-d73bddd5ca58-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d2ebd330-0f96-4619-992b-d73bddd5ca58" (UID: "d2ebd330-0f96-4619-992b-d73bddd5ca58"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.206310 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6fdc856c5d-h78s9"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.227123 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c8mm\" (UniqueName: \"kubernetes.io/projected/68eeb4a0-b192-4e6a-b02b-f34415b29316-kube-api-access-4c8mm\") pod \"infra-operator-controller-manager-769d9c7585-4f5hq\" (UID: \"68eeb4a0-b192-4e6a-b02b-f34415b29316\") " pod="openstack-operators/infra-operator-controller-manager-769d9c7585-4f5hq" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.237107 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv8fk\" (UniqueName: \"kubernetes.io/projected/26aa13a3-737a-457f-9d46-29018cfccd1e-kube-api-access-gv8fk\") pod \"ironic-operator-controller-manager-5c75d7c94b-ltqbl\" (UID: \"26aa13a3-737a-457f-9d46-29018cfccd1e\") " pod="openstack-operators/ironic-operator-controller-manager-5c75d7c94b-ltqbl" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.243131 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2fgt\" (UniqueName: \"kubernetes.io/projected/b4f4311c-5634-4bae-8659-5efa662f0562-kube-api-access-v2fgt\") pod \"heat-operator-controller-manager-bf4c6585d-tnxst\" (UID: \"b4f4311c-5634-4bae-8659-5efa662f0562\") " pod="openstack-operators/heat-operator-controller-manager-bf4c6585d-tnxst" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.244909 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6d8fd67bf7-jnx9f" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.257241 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vpwc\" (UniqueName: \"kubernetes.io/projected/77d539d7-5235-4576-a276-8247c5824020-kube-api-access-6vpwc\") pod \"glance-operator-controller-manager-8667fbf6f6-k9wzp\" (UID: \"77d539d7-5235-4576-a276-8247c5824020\") " pod="openstack-operators/glance-operator-controller-manager-8667fbf6f6-k9wzp" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.263119 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjq9r\" (UniqueName: \"kubernetes.io/projected/17904228-d0e5-489c-a965-5cba44f3b3f2-kube-api-access-cjq9r\") pod \"designate-operator-controller-manager-56dfb6b67f-77wgb\" (UID: \"17904228-d0e5-489c-a965-5cba44f3b3f2\") " pod="openstack-operators/designate-operator-controller-manager-56dfb6b67f-77wgb" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.265035 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mc9l\" (UniqueName: \"kubernetes.io/projected/70a63634-9a9f-46b3-af05-9dc02c0a03e1-kube-api-access-8mc9l\") pod \"horizon-operator-controller-manager-5d86b44686-4x76m\" (UID: \"70a63634-9a9f-46b3-af05-9dc02c0a03e1\") " pod="openstack-operators/horizon-operator-controller-manager-5d86b44686-4x76m" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.266303 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-6dc664666c-6flr8"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.285211 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-56dfb6b67f-77wgb" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.285745 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8667fbf6f6-k9wzp" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.286911 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-6dc664666c-6flr8" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.287665 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xjjz\" (UniqueName: \"kubernetes.io/projected/c089c738-65b8-46e2-91c9-59b962081c05-kube-api-access-9xjjz\") pod \"nova-operator-controller-manager-86d796d84d-vkltr\" (UID: \"c089c738-65b8-46e2-91c9-59b962081c05\") " pod="openstack-operators/nova-operator-controller-manager-86d796d84d-vkltr" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.287717 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62pfg\" (UniqueName: \"kubernetes.io/projected/a30aea9a-f4c8-42a3-89bb-af9ffef55544-kube-api-access-62pfg\") pod \"mariadb-operator-controller-manager-6f8c5b86cb-94tjk\" (UID: \"a30aea9a-f4c8-42a3-89bb-af9ffef55544\") " pod="openstack-operators/mariadb-operator-controller-manager-6f8c5b86cb-94tjk" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.287758 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlwls\" (UniqueName: \"kubernetes.io/projected/13a3e7f4-4c3d-46e7-a7ee-612f9ac17bc7-kube-api-access-jlwls\") pod \"manila-operator-controller-manager-7bb88cb858-44jfn\" (UID: \"13a3e7f4-4c3d-46e7-a7ee-612f9ac17bc7\") " pod="openstack-operators/manila-operator-controller-manager-7bb88cb858-44jfn" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.287804 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zgk6\" (UniqueName: \"kubernetes.io/projected/71d78263-9c76-454f-8b9f-1392c9fcfc2f-kube-api-access-9zgk6\") pod 
\"octavia-operator-controller-manager-6fdc856c5d-h78s9\" (UID: \"71d78263-9c76-454f-8b9f-1392c9fcfc2f\") " pod="openstack-operators/octavia-operator-controller-manager-6fdc856c5d-h78s9" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.287838 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd4dl\" (UniqueName: \"kubernetes.io/projected/ebed0d67-0bac-4d1f-a2d0-2e367d78d157-kube-api-access-kd4dl\") pod \"keystone-operator-controller-manager-7879fb76fd-4tv9l\" (UID: \"ebed0d67-0bac-4d1f-a2d0-2e367d78d157\") " pod="openstack-operators/keystone-operator-controller-manager-7879fb76fd-4tv9l" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.287861 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8k6q\" (UniqueName: \"kubernetes.io/projected/ffcb9e74-1697-402a-b77b-5a3ecc832759-kube-api-access-m8k6q\") pod \"neutron-operator-controller-manager-66b7d6f598-fffcm\" (UID: \"ffcb9e74-1697-402a-b77b-5a3ecc832759\") " pod="openstack-operators/neutron-operator-controller-manager-66b7d6f598-fffcm" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.287931 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v77nk\" (UniqueName: \"kubernetes.io/projected/d2ebd330-0f96-4619-992b-d73bddd5ca58-kube-api-access-v77nk\") on node \"crc\" DevicePath \"\"" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.287943 4563 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2ebd330-0f96-4619-992b-d73bddd5ca58-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.287953 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2ebd330-0f96-4619-992b-d73bddd5ca58-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.287966 4563 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2ebd330-0f96-4619-992b-d73bddd5ca58-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.298394 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-bf4c6585d-tnxst" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.305455 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-79d88dcd444qmtr"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.311415 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-79d88dcd444qmtr" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.318789 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5d86b44686-4x76m" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.339168 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-769d9c7585-4f5hq" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.348443 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8k6q\" (UniqueName: \"kubernetes.io/projected/ffcb9e74-1697-402a-b77b-5a3ecc832759-kube-api-access-m8k6q\") pod \"neutron-operator-controller-manager-66b7d6f598-fffcm\" (UID: \"ffcb9e74-1697-402a-b77b-5a3ecc832759\") " pod="openstack-operators/neutron-operator-controller-manager-66b7d6f598-fffcm" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.357141 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5c75d7c94b-ltqbl" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.359256 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.366527 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd4dl\" (UniqueName: \"kubernetes.io/projected/ebed0d67-0bac-4d1f-a2d0-2e367d78d157-kube-api-access-kd4dl\") pod \"keystone-operator-controller-manager-7879fb76fd-4tv9l\" (UID: \"ebed0d67-0bac-4d1f-a2d0-2e367d78d157\") " pod="openstack-operators/keystone-operator-controller-manager-7879fb76fd-4tv9l" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.368454 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xjjz\" (UniqueName: \"kubernetes.io/projected/c089c738-65b8-46e2-91c9-59b962081c05-kube-api-access-9xjjz\") pod \"nova-operator-controller-manager-86d796d84d-vkltr\" (UID: \"c089c738-65b8-46e2-91c9-59b962081c05\") " pod="openstack-operators/nova-operator-controller-manager-86d796d84d-vkltr" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.371728 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlwls\" (UniqueName: \"kubernetes.io/projected/13a3e7f4-4c3d-46e7-a7ee-612f9ac17bc7-kube-api-access-jlwls\") pod \"manila-operator-controller-manager-7bb88cb858-44jfn\" (UID: \"13a3e7f4-4c3d-46e7-a7ee-612f9ac17bc7\") " pod="openstack-operators/manila-operator-controller-manager-7bb88cb858-44jfn" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.389508 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vptp\" (UniqueName: \"kubernetes.io/projected/974a1619-7c48-46d6-b639-5f965c6b747a-kube-api-access-8vptp\") pod 
\"openstack-baremetal-operator-controller-manager-79d88dcd444qmtr\" (UID: \"974a1619-7c48-46d6-b639-5f965c6b747a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-79d88dcd444qmtr" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.389573 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8mqx\" (UniqueName: \"kubernetes.io/projected/6a018387-ddf9-40f3-a421-d1a760581c8f-kube-api-access-s8mqx\") pod \"placement-operator-controller-manager-6dc664666c-6flr8\" (UID: \"6a018387-ddf9-40f3-a421-d1a760581c8f\") " pod="openstack-operators/placement-operator-controller-manager-6dc664666c-6flr8" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.389620 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zgk6\" (UniqueName: \"kubernetes.io/projected/71d78263-9c76-454f-8b9f-1392c9fcfc2f-kube-api-access-9zgk6\") pod \"octavia-operator-controller-manager-6fdc856c5d-h78s9\" (UID: \"71d78263-9c76-454f-8b9f-1392c9fcfc2f\") " pod="openstack-operators/octavia-operator-controller-manager-6fdc856c5d-h78s9" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.389705 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/974a1619-7c48-46d6-b639-5f965c6b747a-cert\") pod \"openstack-baremetal-operator-controller-manager-79d88dcd444qmtr\" (UID: \"974a1619-7c48-46d6-b639-5f965c6b747a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-79d88dcd444qmtr" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.420663 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5bdf4f7f7f-6n5jh"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.423038 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62pfg\" (UniqueName: 
\"kubernetes.io/projected/a30aea9a-f4c8-42a3-89bb-af9ffef55544-kube-api-access-62pfg\") pod \"mariadb-operator-controller-manager-6f8c5b86cb-94tjk\" (UID: \"a30aea9a-f4c8-42a3-89bb-af9ffef55544\") " pod="openstack-operators/mariadb-operator-controller-manager-6f8c5b86cb-94tjk" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.424288 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5bdf4f7f7f-6n5jh" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.424501 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zgk6\" (UniqueName: \"kubernetes.io/projected/71d78263-9c76-454f-8b9f-1392c9fcfc2f-kube-api-access-9zgk6\") pod \"octavia-operator-controller-manager-6fdc856c5d-h78s9\" (UID: \"71d78263-9c76-454f-8b9f-1392c9fcfc2f\") " pod="openstack-operators/octavia-operator-controller-manager-6fdc856c5d-h78s9" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.459413 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-799cb6ffd6-wck8j"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.460556 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-799cb6ffd6-wck8j" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.473852 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7bb88cb858-44jfn" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.491664 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/974a1619-7c48-46d6-b639-5f965c6b747a-cert\") pod \"openstack-baremetal-operator-controller-manager-79d88dcd444qmtr\" (UID: \"974a1619-7c48-46d6-b639-5f965c6b747a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-79d88dcd444qmtr" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.491735 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vptp\" (UniqueName: \"kubernetes.io/projected/974a1619-7c48-46d6-b639-5f965c6b747a-kube-api-access-8vptp\") pod \"openstack-baremetal-operator-controller-manager-79d88dcd444qmtr\" (UID: \"974a1619-7c48-46d6-b639-5f965c6b747a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-79d88dcd444qmtr" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.491767 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjrv2\" (UniqueName: \"kubernetes.io/projected/31e8d237-829e-47b0-8a2c-8e316a37dc78-kube-api-access-gjrv2\") pod \"swift-operator-controller-manager-799cb6ffd6-wck8j\" (UID: \"31e8d237-829e-47b0-8a2c-8e316a37dc78\") " pod="openstack-operators/swift-operator-controller-manager-799cb6ffd6-wck8j" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.491789 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8mqx\" (UniqueName: \"kubernetes.io/projected/6a018387-ddf9-40f3-a421-d1a760581c8f-kube-api-access-s8mqx\") pod \"placement-operator-controller-manager-6dc664666c-6flr8\" (UID: \"6a018387-ddf9-40f3-a421-d1a760581c8f\") " pod="openstack-operators/placement-operator-controller-manager-6dc664666c-6flr8" Nov 24 09:16:14 crc 
kubenswrapper[4563]: I1124 09:16:14.491832 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwpv2\" (UniqueName: \"kubernetes.io/projected/9fb1ddc7-1195-412e-93ed-4799bc756bae-kube-api-access-cwpv2\") pod \"ovn-operator-controller-manager-5bdf4f7f7f-6n5jh\" (UID: \"9fb1ddc7-1195-412e-93ed-4799bc756bae\") " pod="openstack-operators/ovn-operator-controller-manager-5bdf4f7f7f-6n5jh" Nov 24 09:16:14 crc kubenswrapper[4563]: E1124 09:16:14.491956 4563 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 24 09:16:14 crc kubenswrapper[4563]: E1124 09:16:14.492001 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/974a1619-7c48-46d6-b639-5f965c6b747a-cert podName:974a1619-7c48-46d6-b639-5f965c6b747a nodeName:}" failed. No retries permitted until 2025-11-24 09:16:14.99198582 +0000 UTC m=+752.250963267 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/974a1619-7c48-46d6-b639-5f965c6b747a-cert") pod "openstack-baremetal-operator-controller-manager-79d88dcd444qmtr" (UID: "974a1619-7c48-46d6-b639-5f965c6b747a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.492518 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-6dc664666c-6flr8"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.525286 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7879fb76fd-4tv9l" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.531404 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vptp\" (UniqueName: \"kubernetes.io/projected/974a1619-7c48-46d6-b639-5f965c6b747a-kube-api-access-8vptp\") pod \"openstack-baremetal-operator-controller-manager-79d88dcd444qmtr\" (UID: \"974a1619-7c48-46d6-b639-5f965c6b747a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-79d88dcd444qmtr" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.533412 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5bdf4f7f7f-6n5jh"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.551950 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6f8c5b86cb-94tjk" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.552418 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7798859c74-z5b6f"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.553480 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7798859c74-z5b6f" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.556149 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8mqx\" (UniqueName: \"kubernetes.io/projected/6a018387-ddf9-40f3-a421-d1a760581c8f-kube-api-access-s8mqx\") pod \"placement-operator-controller-manager-6dc664666c-6flr8\" (UID: \"6a018387-ddf9-40f3-a421-d1a760581c8f\") " pod="openstack-operators/placement-operator-controller-manager-6dc664666c-6flr8" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.556249 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-79d88dcd444qmtr"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.560142 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-799cb6ffd6-wck8j"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.586027 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-66b7d6f598-fffcm" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.589992 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7798859c74-z5b6f"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.594407 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-8464cf66df-chpfj"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.597693 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-8464cf66df-chpfj" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.605461 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjrv2\" (UniqueName: \"kubernetes.io/projected/31e8d237-829e-47b0-8a2c-8e316a37dc78-kube-api-access-gjrv2\") pod \"swift-operator-controller-manager-799cb6ffd6-wck8j\" (UID: \"31e8d237-829e-47b0-8a2c-8e316a37dc78\") " pod="openstack-operators/swift-operator-controller-manager-799cb6ffd6-wck8j" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.605509 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89skb\" (UniqueName: \"kubernetes.io/projected/9d2aa6d4-db94-44dd-99f1-6e95e8de9a5f-kube-api-access-89skb\") pod \"telemetry-operator-controller-manager-7798859c74-z5b6f\" (UID: \"9d2aa6d4-db94-44dd-99f1-6e95e8de9a5f\") " pod="openstack-operators/telemetry-operator-controller-manager-7798859c74-z5b6f" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.605554 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwpv2\" (UniqueName: \"kubernetes.io/projected/9fb1ddc7-1195-412e-93ed-4799bc756bae-kube-api-access-cwpv2\") pod \"ovn-operator-controller-manager-5bdf4f7f7f-6n5jh\" (UID: \"9fb1ddc7-1195-412e-93ed-4799bc756bae\") " pod="openstack-operators/ovn-operator-controller-manager-5bdf4f7f7f-6n5jh" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.606011 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-8464cf66df-chpfj"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.606477 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-86d796d84d-vkltr" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.626883 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7cd4fb6f79-qhzw4"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.627968 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7cd4fb6f79-qhzw4" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.630956 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7cd4fb6f79-qhzw4"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.638254 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjrv2\" (UniqueName: \"kubernetes.io/projected/31e8d237-829e-47b0-8a2c-8e316a37dc78-kube-api-access-gjrv2\") pod \"swift-operator-controller-manager-799cb6ffd6-wck8j\" (UID: \"31e8d237-829e-47b0-8a2c-8e316a37dc78\") " pod="openstack-operators/swift-operator-controller-manager-799cb6ffd6-wck8j" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.647085 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6fdc856c5d-h78s9" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.662787 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwpv2\" (UniqueName: \"kubernetes.io/projected/9fb1ddc7-1195-412e-93ed-4799bc756bae-kube-api-access-cwpv2\") pod \"ovn-operator-controller-manager-5bdf4f7f7f-6n5jh\" (UID: \"9fb1ddc7-1195-412e-93ed-4799bc756bae\") " pod="openstack-operators/ovn-operator-controller-manager-5bdf4f7f7f-6n5jh" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.662936 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6cb9dc54f8-m7w2q"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.664404 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6cb9dc54f8-m7w2q" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.667964 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.682109 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6cb9dc54f8-m7w2q"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.687335 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7768f8c84f-glf4s"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.707307 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cdec1b8b-630a-452a-b4d9-3cd42ef204c7-cert\") pod \"openstack-operator-controller-manager-6cb9dc54f8-m7w2q\" (UID: \"cdec1b8b-630a-452a-b4d9-3cd42ef204c7\") " pod="openstack-operators/openstack-operator-controller-manager-6cb9dc54f8-m7w2q" Nov 24 
09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.707359 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhhkt\" (UniqueName: \"kubernetes.io/projected/d5e12170-5cc0-4f8f-89d7-c64f38f2226e-kube-api-access-mhhkt\") pod \"watcher-operator-controller-manager-7cd4fb6f79-qhzw4\" (UID: \"d5e12170-5cc0-4f8f-89d7-c64f38f2226e\") " pod="openstack-operators/watcher-operator-controller-manager-7cd4fb6f79-qhzw4" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.707435 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdx6z\" (UniqueName: \"kubernetes.io/projected/cdec1b8b-630a-452a-b4d9-3cd42ef204c7-kube-api-access-wdx6z\") pod \"openstack-operator-controller-manager-6cb9dc54f8-m7w2q\" (UID: \"cdec1b8b-630a-452a-b4d9-3cd42ef204c7\") " pod="openstack-operators/openstack-operator-controller-manager-6cb9dc54f8-m7w2q" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.707487 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89skb\" (UniqueName: \"kubernetes.io/projected/9d2aa6d4-db94-44dd-99f1-6e95e8de9a5f-kube-api-access-89skb\") pod \"telemetry-operator-controller-manager-7798859c74-z5b6f\" (UID: \"9d2aa6d4-db94-44dd-99f1-6e95e8de9a5f\") " pod="openstack-operators/telemetry-operator-controller-manager-7798859c74-z5b6f" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.707510 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-278qb\" (UniqueName: \"kubernetes.io/projected/238f517b-0e10-411c-8b3c-c6bdbe261159-kube-api-access-278qb\") pod \"test-operator-controller-manager-8464cf66df-chpfj\" (UID: \"238f517b-0e10-411c-8b3c-c6bdbe261159\") " pod="openstack-operators/test-operator-controller-manager-8464cf66df-chpfj" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.712849 4563 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-6dc664666c-6flr8" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.750736 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-g64g6"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.752122 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89skb\" (UniqueName: \"kubernetes.io/projected/9d2aa6d4-db94-44dd-99f1-6e95e8de9a5f-kube-api-access-89skb\") pod \"telemetry-operator-controller-manager-7798859c74-z5b6f\" (UID: \"9d2aa6d4-db94-44dd-99f1-6e95e8de9a5f\") " pod="openstack-operators/telemetry-operator-controller-manager-7798859c74-z5b6f" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.752457 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-g64g6" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.755832 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-g64g6"] Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.797342 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5bdf4f7f7f-6n5jh" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.809214 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-278qb\" (UniqueName: \"kubernetes.io/projected/238f517b-0e10-411c-8b3c-c6bdbe261159-kube-api-access-278qb\") pod \"test-operator-controller-manager-8464cf66df-chpfj\" (UID: \"238f517b-0e10-411c-8b3c-c6bdbe261159\") " pod="openstack-operators/test-operator-controller-manager-8464cf66df-chpfj" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.809529 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cdec1b8b-630a-452a-b4d9-3cd42ef204c7-cert\") pod \"openstack-operator-controller-manager-6cb9dc54f8-m7w2q\" (UID: \"cdec1b8b-630a-452a-b4d9-3cd42ef204c7\") " pod="openstack-operators/openstack-operator-controller-manager-6cb9dc54f8-m7w2q" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.809572 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhhkt\" (UniqueName: \"kubernetes.io/projected/d5e12170-5cc0-4f8f-89d7-c64f38f2226e-kube-api-access-mhhkt\") pod \"watcher-operator-controller-manager-7cd4fb6f79-qhzw4\" (UID: \"d5e12170-5cc0-4f8f-89d7-c64f38f2226e\") " pod="openstack-operators/watcher-operator-controller-manager-7cd4fb6f79-qhzw4" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.809695 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdx6z\" (UniqueName: \"kubernetes.io/projected/cdec1b8b-630a-452a-b4d9-3cd42ef204c7-kube-api-access-wdx6z\") pod \"openstack-operator-controller-manager-6cb9dc54f8-m7w2q\" (UID: \"cdec1b8b-630a-452a-b4d9-3cd42ef204c7\") " pod="openstack-operators/openstack-operator-controller-manager-6cb9dc54f8-m7w2q" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.809731 4563 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfbhk\" (UniqueName: \"kubernetes.io/projected/00f5e4f8-193c-48df-b29f-8f359f263a5a-kube-api-access-dfbhk\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-g64g6\" (UID: \"00f5e4f8-193c-48df-b29f-8f359f263a5a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-g64g6" Nov 24 09:16:14 crc kubenswrapper[4563]: E1124 09:16:14.809735 4563 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 24 09:16:14 crc kubenswrapper[4563]: E1124 09:16:14.809805 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdec1b8b-630a-452a-b4d9-3cd42ef204c7-cert podName:cdec1b8b-630a-452a-b4d9-3cd42ef204c7 nodeName:}" failed. No retries permitted until 2025-11-24 09:16:15.30978423 +0000 UTC m=+752.568761677 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cdec1b8b-630a-452a-b4d9-3cd42ef204c7-cert") pod "openstack-operator-controller-manager-6cb9dc54f8-m7w2q" (UID: "cdec1b8b-630a-452a-b4d9-3cd42ef204c7") : secret "webhook-server-cert" not found Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.824863 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-278qb\" (UniqueName: \"kubernetes.io/projected/238f517b-0e10-411c-8b3c-c6bdbe261159-kube-api-access-278qb\") pod \"test-operator-controller-manager-8464cf66df-chpfj\" (UID: \"238f517b-0e10-411c-8b3c-c6bdbe261159\") " pod="openstack-operators/test-operator-controller-manager-8464cf66df-chpfj" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.825856 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdx6z\" (UniqueName: \"kubernetes.io/projected/cdec1b8b-630a-452a-b4d9-3cd42ef204c7-kube-api-access-wdx6z\") pod 
\"openstack-operator-controller-manager-6cb9dc54f8-m7w2q\" (UID: \"cdec1b8b-630a-452a-b4d9-3cd42ef204c7\") " pod="openstack-operators/openstack-operator-controller-manager-6cb9dc54f8-m7w2q" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.827494 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhhkt\" (UniqueName: \"kubernetes.io/projected/d5e12170-5cc0-4f8f-89d7-c64f38f2226e-kube-api-access-mhhkt\") pod \"watcher-operator-controller-manager-7cd4fb6f79-qhzw4\" (UID: \"d5e12170-5cc0-4f8f-89d7-c64f38f2226e\") " pod="openstack-operators/watcher-operator-controller-manager-7cd4fb6f79-qhzw4" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.829619 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-799cb6ffd6-wck8j" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.899130 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7798859c74-z5b6f" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.911510 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfbhk\" (UniqueName: \"kubernetes.io/projected/00f5e4f8-193c-48df-b29f-8f359f263a5a-kube-api-access-dfbhk\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-g64g6\" (UID: \"00f5e4f8-193c-48df-b29f-8f359f263a5a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-g64g6" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.929337 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfbhk\" (UniqueName: \"kubernetes.io/projected/00f5e4f8-193c-48df-b29f-8f359f263a5a-kube-api-access-dfbhk\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-g64g6\" (UID: \"00f5e4f8-193c-48df-b29f-8f359f263a5a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-g64g6" Nov 24 
09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.944015 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-8464cf66df-chpfj" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.968910 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7cd4fb6f79-qhzw4" Nov 24 09:16:14 crc kubenswrapper[4563]: I1124 09:16:14.982584 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6d8fd67bf7-jnx9f"] Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.013068 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/974a1619-7c48-46d6-b639-5f965c6b747a-cert\") pod \"openstack-baremetal-operator-controller-manager-79d88dcd444qmtr\" (UID: \"974a1619-7c48-46d6-b639-5f965c6b747a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-79d88dcd444qmtr" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.017347 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/974a1619-7c48-46d6-b639-5f965c6b747a-cert\") pod \"openstack-baremetal-operator-controller-manager-79d88dcd444qmtr\" (UID: \"974a1619-7c48-46d6-b639-5f965c6b747a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-79d88dcd444qmtr" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.037274 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7768f8c84f-glf4s" event={"ID":"f81c148e-bf8e-4b57-895e-f2c11411cf7a","Type":"ContainerStarted","Data":"2d11505c041e2a1741bc1ee1153c12ac384a277acd80f69da861ce88a631a84c"} Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.040344 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66d44fd8b6-2vgq6" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.040899 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6d8fd67bf7-jnx9f" event={"ID":"a62a6523-e592-437f-b3ba-320e24f619dc","Type":"ContainerStarted","Data":"1b3e661b6bf78fb8d43d4b2dde0baf27d715050254a6b27ca99a6cbda13ec3e8"} Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.040954 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57f4b94764-cp6cn" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.059848 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-79d88dcd444qmtr" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.131201 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-g64g6" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.291662 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5889cddd94-gx6ft"] Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.292678 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5889cddd94-gx6ft" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.295101 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.295273 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.295603 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.295980 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.296730 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.297448 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.305886 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66d44fd8b6-2vgq6"] Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.307946 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.314130 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-66d44fd8b6-2vgq6"] Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.319272 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5889cddd94-gx6ft"] Nov 24 09:16:15 crc 
kubenswrapper[4563]: I1124 09:16:15.322329 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45111afc-fb32-4938-ae09-118dc9b31c06-serving-cert\") pod \"controller-manager-5889cddd94-gx6ft\" (UID: \"45111afc-fb32-4938-ae09-118dc9b31c06\") " pod="openshift-controller-manager/controller-manager-5889cddd94-gx6ft" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.322419 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/45111afc-fb32-4938-ae09-118dc9b31c06-client-ca\") pod \"controller-manager-5889cddd94-gx6ft\" (UID: \"45111afc-fb32-4938-ae09-118dc9b31c06\") " pod="openshift-controller-manager/controller-manager-5889cddd94-gx6ft" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.322509 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cdec1b8b-630a-452a-b4d9-3cd42ef204c7-cert\") pod \"openstack-operator-controller-manager-6cb9dc54f8-m7w2q\" (UID: \"cdec1b8b-630a-452a-b4d9-3cd42ef204c7\") " pod="openstack-operators/openstack-operator-controller-manager-6cb9dc54f8-m7w2q" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.322606 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkcrj\" (UniqueName: \"kubernetes.io/projected/45111afc-fb32-4938-ae09-118dc9b31c06-kube-api-access-hkcrj\") pod \"controller-manager-5889cddd94-gx6ft\" (UID: \"45111afc-fb32-4938-ae09-118dc9b31c06\") " pod="openshift-controller-manager/controller-manager-5889cddd94-gx6ft" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.322697 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/45111afc-fb32-4938-ae09-118dc9b31c06-proxy-ca-bundles\") pod \"controller-manager-5889cddd94-gx6ft\" (UID: \"45111afc-fb32-4938-ae09-118dc9b31c06\") " pod="openshift-controller-manager/controller-manager-5889cddd94-gx6ft" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.322793 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45111afc-fb32-4938-ae09-118dc9b31c06-config\") pod \"controller-manager-5889cddd94-gx6ft\" (UID: \"45111afc-fb32-4938-ae09-118dc9b31c06\") " pod="openshift-controller-manager/controller-manager-5889cddd94-gx6ft" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.326982 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57f4b94764-cp6cn"] Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.330259 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57f4b94764-cp6cn"] Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.332126 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cdec1b8b-630a-452a-b4d9-3cd42ef204c7-cert\") pod \"openstack-operator-controller-manager-6cb9dc54f8-m7w2q\" (UID: \"cdec1b8b-630a-452a-b4d9-3cd42ef204c7\") " pod="openstack-operators/openstack-operator-controller-manager-6cb9dc54f8-m7w2q" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.345389 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-bf4c6585d-tnxst"] Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.355628 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-769d9c7585-4f5hq"] Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.360378 4563 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/horizon-operator-controller-manager-5d86b44686-4x76m"] Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.363944 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8667fbf6f6-k9wzp"] Nov 24 09:16:15 crc kubenswrapper[4563]: W1124 09:16:15.364685 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70a63634_9a9f_46b3_af05_9dc02c0a03e1.slice/crio-2b638cdf67addfe80d12a573de30432a7be54a5a896ead72b3d9316266070dbb WatchSource:0}: Error finding container 2b638cdf67addfe80d12a573de30432a7be54a5a896ead72b3d9316266070dbb: Status 404 returned error can't find the container with id 2b638cdf67addfe80d12a573de30432a7be54a5a896ead72b3d9316266070dbb Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.424350 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45111afc-fb32-4938-ae09-118dc9b31c06-proxy-ca-bundles\") pod \"controller-manager-5889cddd94-gx6ft\" (UID: \"45111afc-fb32-4938-ae09-118dc9b31c06\") " pod="openshift-controller-manager/controller-manager-5889cddd94-gx6ft" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.424390 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkcrj\" (UniqueName: \"kubernetes.io/projected/45111afc-fb32-4938-ae09-118dc9b31c06-kube-api-access-hkcrj\") pod \"controller-manager-5889cddd94-gx6ft\" (UID: \"45111afc-fb32-4938-ae09-118dc9b31c06\") " pod="openshift-controller-manager/controller-manager-5889cddd94-gx6ft" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.424463 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45111afc-fb32-4938-ae09-118dc9b31c06-config\") pod \"controller-manager-5889cddd94-gx6ft\" (UID: 
\"45111afc-fb32-4938-ae09-118dc9b31c06\") " pod="openshift-controller-manager/controller-manager-5889cddd94-gx6ft" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.424496 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45111afc-fb32-4938-ae09-118dc9b31c06-serving-cert\") pod \"controller-manager-5889cddd94-gx6ft\" (UID: \"45111afc-fb32-4938-ae09-118dc9b31c06\") " pod="openshift-controller-manager/controller-manager-5889cddd94-gx6ft" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.424520 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/45111afc-fb32-4938-ae09-118dc9b31c06-client-ca\") pod \"controller-manager-5889cddd94-gx6ft\" (UID: \"45111afc-fb32-4938-ae09-118dc9b31c06\") " pod="openshift-controller-manager/controller-manager-5889cddd94-gx6ft" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.425398 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45111afc-fb32-4938-ae09-118dc9b31c06-proxy-ca-bundles\") pod \"controller-manager-5889cddd94-gx6ft\" (UID: \"45111afc-fb32-4938-ae09-118dc9b31c06\") " pod="openshift-controller-manager/controller-manager-5889cddd94-gx6ft" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.425412 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/45111afc-fb32-4938-ae09-118dc9b31c06-client-ca\") pod \"controller-manager-5889cddd94-gx6ft\" (UID: \"45111afc-fb32-4938-ae09-118dc9b31c06\") " pod="openshift-controller-manager/controller-manager-5889cddd94-gx6ft" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.426745 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45111afc-fb32-4938-ae09-118dc9b31c06-config\") 
pod \"controller-manager-5889cddd94-gx6ft\" (UID: \"45111afc-fb32-4938-ae09-118dc9b31c06\") " pod="openshift-controller-manager/controller-manager-5889cddd94-gx6ft" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.428685 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45111afc-fb32-4938-ae09-118dc9b31c06-serving-cert\") pod \"controller-manager-5889cddd94-gx6ft\" (UID: \"45111afc-fb32-4938-ae09-118dc9b31c06\") " pod="openshift-controller-manager/controller-manager-5889cddd94-gx6ft" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.439280 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkcrj\" (UniqueName: \"kubernetes.io/projected/45111afc-fb32-4938-ae09-118dc9b31c06-kube-api-access-hkcrj\") pod \"controller-manager-5889cddd94-gx6ft\" (UID: \"45111afc-fb32-4938-ae09-118dc9b31c06\") " pod="openshift-controller-manager/controller-manager-5889cddd94-gx6ft" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.547712 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7879fb76fd-4tv9l"] Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.552797 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6fdc856c5d-h78s9"] Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.558111 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7bb88cb858-44jfn"] Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.618796 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5889cddd94-gx6ft" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.632193 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6cb9dc54f8-m7w2q" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.740222 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-56dfb6b67f-77wgb"] Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.742706 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-799cb6ffd6-wck8j"] Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.751313 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-6dc664666c-6flr8"] Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.758990 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5c75d7c94b-ltqbl"] Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.762987 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5bdf4f7f7f-6n5jh"] Nov 24 09:16:15 crc kubenswrapper[4563]: W1124 09:16:15.764550 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31e8d237_829e_47b0_8a2c_8e316a37dc78.slice/crio-34042c5855858f187520a118c30acf38a682046ae0d5703dc99d3c96df5ffdfe WatchSource:0}: Error finding container 34042c5855858f187520a118c30acf38a682046ae0d5703dc99d3c96df5ffdfe: Status 404 returned error can't find the container with id 34042c5855858f187520a118c30acf38a682046ae0d5703dc99d3c96df5ffdfe Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.770079 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-86d796d84d-vkltr"] Nov 24 09:16:15 crc kubenswrapper[4563]: W1124 09:16:15.771809 4563 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26aa13a3_737a_457f_9d46_29018cfccd1e.slice/crio-e1da724ce5e0ca6ca11511d732574b88f413d46c1e6c00cf81fab81c0b886a6d WatchSource:0}: Error finding container e1da724ce5e0ca6ca11511d732574b88f413d46c1e6c00cf81fab81c0b886a6d: Status 404 returned error can't find the container with id e1da724ce5e0ca6ca11511d732574b88f413d46c1e6c00cf81fab81c0b886a6d Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.797874 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6f8c5b86cb-94tjk"] Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.809965 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-8464cf66df-chpfj"] Nov 24 09:16:15 crc kubenswrapper[4563]: W1124 09:16:15.814383 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d2aa6d4_db94_44dd_99f1_6e95e8de9a5f.slice/crio-99ae1d87bef2552d5c40cfd6f8b2c8a13aea456ed0b78e748555f436e553df1a WatchSource:0}: Error finding container 99ae1d87bef2552d5c40cfd6f8b2c8a13aea456ed0b78e748555f436e553df1a: Status 404 returned error can't find the container with id 99ae1d87bef2552d5c40cfd6f8b2c8a13aea456ed0b78e748555f436e553df1a Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.815609 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-66b7d6f598-fffcm"] Nov 24 09:16:15 crc kubenswrapper[4563]: W1124 09:16:15.817724 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffcb9e74_1697_402a_b77b_5a3ecc832759.slice/crio-d63435fd07f2f4d48dc9ae0c8a84f00c3b4adb78e66b869621d8f2108efec6d0 WatchSource:0}: Error finding container d63435fd07f2f4d48dc9ae0c8a84f00c3b4adb78e66b869621d8f2108efec6d0: Status 404 returned error can't find the 
container with id d63435fd07f2f4d48dc9ae0c8a84f00c3b4adb78e66b869621d8f2108efec6d0 Nov 24 09:16:15 crc kubenswrapper[4563]: E1124 09:16:15.816972 4563 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:5d49d4594c66eda7b151746cc6e1d3c67c0129b4503eeb043a64ae8ec2da6a1b,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cwpv2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-5bdf4f7f7f-6n5jh_openstack-operators(9fb1ddc7-1195-412e-93ed-4799bc756bae): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 24 09:16:15 crc kubenswrapper[4563]: E1124 09:16:15.819108 4563 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:5324a6d2f76fc3041023b0cbd09a733ef2b59f310d390e4d6483d219eb96494f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-89skb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-7798859c74-z5b6f_openstack-operators(9d2aa6d4-db94-44dd-99f1-6e95e8de9a5f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.819544 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7798859c74-z5b6f"] Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.823599 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-g64g6"] Nov 24 09:16:15 crc kubenswrapper[4563]: E1124 09:16:15.824050 4563 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:207578cb433471cc1a79c21a808c8a15489d1d3c9fa77e29f3f697c33917fec6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m8k6q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-66b7d6f598-fffcm_openstack-operators(ffcb9e74-1697-402a-b77b-5a3ecc832759): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 24 09:16:15 crc kubenswrapper[4563]: W1124 09:16:15.826600 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod974a1619_7c48_46d6_b639_5f965c6b747a.slice/crio-fc873657a1fb1f883dfdc8b569ee23063d589ffaa01e644aa8b0c6edf05b96d8 WatchSource:0}: Error finding container fc873657a1fb1f883dfdc8b569ee23063d589ffaa01e644aa8b0c6edf05b96d8: Status 404 returned error can't find the container with id fc873657a1fb1f883dfdc8b569ee23063d589ffaa01e644aa8b0c6edf05b96d8 Nov 24 09:16:15 crc kubenswrapper[4563]: E1124 09:16:15.828402 4563 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:82207e753574d4be246f86c4b074500d66cf20214aa80f0a8525cf3287a35e6d,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-278qb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
test-operator-controller-manager-8464cf66df-chpfj_openstack-operators(238f517b-0e10-411c-8b3c-c6bdbe261159): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.841714 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-79d88dcd444qmtr"] Nov 24 09:16:15 crc kubenswrapper[4563]: E1124 09:16:15.844065 4563 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:78852f8ba332a5756c1551c126157f735279101a0fc3277ba4aa4db3478789dd,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent@sha256:7dbadf7b98f2f305f9f1382f55a084c8ca404f4263f76b28e56bd0dc437e2192,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner@sha256:0473ff9eec0da231e2d0a10bf1abbe1dfa1a0f95b8f619e3a07605386951449a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api@sha256:c8101c77a82eae4407e41e1fd766dfc6e1b7f9ed1679e3efb6f91ff97a1557b2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator@sha256:eb9743b21bbadca6f7cb9ac4fc46b5d58c51c674073c7e1121f4474a71304071,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener@sha256:3d8
1f839b98c2e2a5bf0da79f2f9a92dff7d0a3c5a830b0e95c89dad8cf98a6a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier@sha256:d19ac99249b47dd8ea16cd6aaa5756346aa8a2f119ee50819c15c5366efb417d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24@sha256:8536169e5537fe6c330eba814248abdcf39cdd8f7e7336034d74e6fda9544050,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:4c93a5cccb9971e24f05daf93b3aa11ba71752bc3469a1a1a2c4906f92f69645,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener@sha256:4f1fa337760e82bfd67cdd142a97c121146dd7e621daac161940dd5e4ddb80dc,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker@sha256:3613b345d5baed98effd906f8b0242d863e14c97078ea473ef01fe1b0afc46f3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:d375d370be5ead0dac71109af644849e5795f535f9ad8eeacea261d77ae6f140,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi@sha256:b73ad22b4955b06d584bce81742556d8c0c7828c495494f8ea7c99391c61b70f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter@sha256:7211a617ec657701ca819aa0ba28e1d5750f5bf2c1391b755cc4a48cc360b0fa,ValueFrom:nil,},EnvVar{Name:RELATED_IM
AGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification@sha256:aa1d3aaf6b394621ed4089a98e0a82b763f467e8b5c5db772f9fdf99fc86e333,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core@sha256:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:37d64e0a00c54e71a4c1fcbbbf7e832f6886ffd03c9a02b6ee3ca48fabc30879,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup@sha256:d6661053141b6df421288a7c9968a155ab82e478c1d75ab41f2cebe2f0ca02d2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler@sha256:ce2d63258cb4e7d0d1c07234de6889c5434464190906798019311a1c7cf6387f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume@sha256:0485ef9e5b4437f7cd2ba54034a87722ce4669ee86b3773c6b0c037ed8000e91,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_API_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api@sha256:962c004551d0503779364b767b9bf0cecdf78dbba8809b2ca8b073f58e1f4e5d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_PROC_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-processor@sha256:0ebf4c465fb6cc7dad9e6cb2da0ff54874c9acbcb40d62234a629ec2c12cdd62,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api@sha256:ff0c553ceeb2e0f44b010e37dc6d0db8a251797b88e56468b7cf7f05253e4232,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-ant
elope-centos9/openstack-designate-backend-bind9@sha256:624f553f073af7493d34828b074adc9981cce403edd8e71482c7307008479fd9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central@sha256:e3874936a518c8560339db8f840fc5461885819f6050b5de8d3ab9199bea5094,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns@sha256:1cea25f1d2a45affc80c46fb9d427749d3f06b61590ac6070a2910e3ec8a4e5d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer@sha256:e36d5b9a65194f12f7b01c6422ba3ed52a687fd1695fbb21f4986c67d9f9317f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound@sha256:8b21bec527d54cd766e277889df6bcccd2baeaa946274606b986c0c3b7ca689f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker@sha256:45aceca77f8fcf61127f0da650bdfdf11ede9b0944c78b63fab819d03283f96b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr@sha256:709ac58998927dd61786821ae1e63343fd97ccf5763aac5edb4583eea9401d22,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid@sha256:867d4ef7c21f75e6030a685b5762ab4d84b671316ed6b98d75200076e93342cd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler@sha256:581b65b646301e0fcb07582150ba63438f1353a85bf9acf1eb2acb4ce71c58bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron@sha256:2b90da93550b99d2fcfa95bd819f3363aa68346a416f8dc7baac3e9c5f487761,ValueFrom:ni
l,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent@sha256:8cde52cef8795d1c91983b100d86541c7718160ec260fe0f97b96add4c2c8ee8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent@sha256:835ebed082fe1c45bd799d1d5357595ce63efeb05ca876f26b08443facb9c164,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent@sha256:011d682241db724bc40736c9b54d2ea450ea7e6be095b1ff5fa28c8007466775,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent@sha256:2025da90cff8f563deb08bee71efe16d4078edc2a767b2e225cca5c77f1aa2f9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:26bd7b0bd6070856aefef6fe754c547d55c056396ea30d879d34c2d49b5a1d29,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.
io/podified-antelope-centos9/openstack-heat-api@sha256:ff46cd5e0e13d105c4629e78c2734a50835f06b6a1e31da9e0462981d10c4be3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn@sha256:5b4fd0c2b76fa5539f74687b11c5882d77bd31352452322b37ff51fa18f12a61,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine@sha256:5e03376bd895346dc8f627ca15ded942526ed8b5e92872f453ce272e694d18d4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon@sha256:65b94ff9fcd486845fb0544583bf2a973246a61a0ad32340fb92d632285f1057,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached@sha256:36a0fb31978aee0ded2483de311631e64a644d0b0685b5b055f65ede7eb8e8a2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis@sha256:5f6045841aff0fde6f684a34cdf49f8dc7b2c3bcbdeab201f1058971e0c5f79e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api@sha256:448f4e1b740c30936e340bd6e8534d78c83357bf373a4223950aa64d3484f007,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor@sha256:b68e3615af8a0eb0ef6bf9ceeef59540a6f4a9a85f6078a3620be115c73a7db8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector@sha256:7eae01cf60383e523c9cd94d158a9162120a7370829a1dad20fdea6b0fd660bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent@sha256:28cc10501788081eb61b5a1af35546191a92741f4f109df54c74e2b19439d0f9,V
alueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe@sha256:9a616e37acfd120612f78043237a8541266ba34883833c9beb43f3da313661ad,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent@sha256:6b1be6cd94a0942259bca5d5d2c30cc7de4a33276b61f8ae3940226772106256,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone@sha256:02d2c22d15401574941fbe057095442dee0d6f7a0a9341de35d25e6a12a3fe4b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api@sha256:fc3b3a36b74fd653946723c54b208072d52200635850b531e9d595a7aaea5a01,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler@sha256:7850ccbff320bf9a1c9c769c1c70777eb97117dd8cd5ae4435be9b4622cf807a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share@sha256:397dac7e39cf40d14a986e6ec4a60fb698ca35c197d0db315b1318514cc6d1d4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:10452e2144368e2f128c8fb8ef9e54880b06ef1d71d9f084a0217dcb099c51ce,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils@sha256:1c95142a36276686e720f86423ee171dc9adcc1e89879f627545b7c906ccd9bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6
081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api@sha256:e331a8fde6638e5ba154c4f0b38772a9a424f60656f2777245975fb1fa02f07d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor@sha256:cd3cf7a34053e850b4d4f9f4ea4c74953a54a42fd18e47d7c01d44a88923e925,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy@sha256:aee28476344fc0cc148fbe97daf9b1bfcedc22001550bba4bdc4e84be7b6989d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler@sha256:cfa0b92c976603ee2a937d34013a238fcd8aa75f998e50642e33489f14124633,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api@sha256:73c2f2d6eecf88acf4e45b133c8373d9bb006b530e0aff0b28f3b7420620a874,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager@sha256:927b405cc04abe5ff716186e8d35e2dc5fad1c8430194659ee6617d74e4e055d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping@sha256:6154d7cebd7c339afa5b86330262156171743aa5b79c2b78f9a2f378005ed8fb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog@sha256:e2db2f4af8d3d0be7868c6efef0189f3a2c74a8f96ae10e3f991cdf83feaef29,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCT
AVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker@sha256:c773629df257726a6d3cacc24a6e4df0babcd7d37df04e6d14676a8da028b9c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:776211111e2e6493706dbc49a3ba44f31d1b947919313ed3a0f35810e304ec52,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather@sha256:0a98e8f5c83522ca6c8e40c5e9561f6628d2d5e69f0e8a64279c541c989d3d8b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi@sha256:7cccf24ad0a152f90ca39893064f48a1656950ee8142685a5d482c71f0bdc9f5,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base@sha256:05450b48f6b5352b2686a26e933e8727748edae2ae9652d9164b7d7a1817c55a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server@sha256:fc9c99eeef91523482bd8f92661b393287e1f2a24ad2ba9e33191f8de9af74cf,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd@sha256:3e4ecc02b4b5e0860482a93599ba9ca598c5ce26c093c46e701f96fe51acb208,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server@
sha256:2346037e064861c7892690d2e8b3e1eea1a26ce3c3a11fda0b41301965bc828c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:7dd2e0dbb6bb5a6cecd1763e43479ca8cb6a0c502534e83c8795c0da2b50e099,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:95d67f51dfedd5bd3ec785b488425295b2d8c41feae3e6386ef471615381809b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account@sha256:c26c3ff9cabe3593ceb10006e782bf9391ac14785768ce9eec4f938c2d3cf228,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container@sha256:273fe8c27d08d0f62773a02f8cef6a761a7768116ee1a4be611f93bbf63f2b75,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object@sha256:daa45220bb1c47922d0917aa8fe423bb82b03a01429f1c9e37635e701e352d71,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server@sha256:a80a074e227d3238bb6f285788a9e886ae7a5909ccbc5c19c93c369bdfe5b3b8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all@sha256:58ac66ca1be01fe0157977bd79a26cde4d0de153edfaf4162367c924826b2ef4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api@sha256:99a63770d80cc7c3afa1118b400972fb0e6bff5284a2eae781b12582ad79c29c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier@sha256:9ee4d84529394afcd860f1a1186484560f02f08c15c37cac42a22473b7116d5f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_
DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine@sha256:ea15fadda7b0439ec637edfaf6ea5dbf3e35fb3be012c7c5a31e722c90becb11,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8vptp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-79d88dcd444qmtr_openstack-operators(974a1619-7c48-46d6-b639-5f965c6b747a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.860144 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7cd4fb6f79-qhzw4"] Nov 24 09:16:15 crc kubenswrapper[4563]: E1124 09:16:15.869876 4563 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 
-3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mhhkt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-7cd4fb6f79-qhzw4_openstack-operators(d5e12170-5cc0-4f8f-89d7-c64f38f2226e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 24 09:16:15 crc kubenswrapper[4563]: I1124 09:16:15.883180 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5889cddd94-gx6ft"] Nov 24 09:16:15 crc kubenswrapper[4563]: E1124 09:16:15.884463 4563 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:7b90521b9e9cb4eb43c2f1c3bf85dbd068d684315f4f705b07708dd078df9d04,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-62pfg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6f8c5b86cb-94tjk_openstack-operators(a30aea9a-f4c8-42a3-89bb-af9ffef55544): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 24 09:16:15 crc kubenswrapper[4563]: E1124 09:16:15.885219 4563 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 
5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dfbhk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-g64g6_openstack-operators(00f5e4f8-193c-48df-b29f-8f359f263a5a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 24 09:16:15 crc kubenswrapper[4563]: E1124 09:16:15.886544 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-g64g6" podUID="00f5e4f8-193c-48df-b29f-8f359f263a5a" Nov 24 09:16:15 crc kubenswrapper[4563]: E1124 09:16:15.981229 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-5bdf4f7f7f-6n5jh" podUID="9fb1ddc7-1195-412e-93ed-4799bc756bae" Nov 24 09:16:16 crc kubenswrapper[4563]: E1124 09:16:16.012369 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-79d88dcd444qmtr" podUID="974a1619-7c48-46d6-b639-5f965c6b747a" Nov 24 09:16:16 crc kubenswrapper[4563]: E1124 09:16:16.027081 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-8464cf66df-chpfj" podUID="238f517b-0e10-411c-8b3c-c6bdbe261159" Nov 24 09:16:16 crc kubenswrapper[4563]: I1124 09:16:16.060493 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7bb88cb858-44jfn" event={"ID":"13a3e7f4-4c3d-46e7-a7ee-612f9ac17bc7","Type":"ContainerStarted","Data":"e016cff61b54a95daa6a959d9cc59deb5e17e5e9f2f4f047407f85c382605675"} Nov 24 09:16:16 crc kubenswrapper[4563]: I1124 09:16:16.063108 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5bdf4f7f7f-6n5jh" event={"ID":"9fb1ddc7-1195-412e-93ed-4799bc756bae","Type":"ContainerStarted","Data":"40b3e9af32f2be09445cdd81f7e03fb59c7202ba85a7e58d5a86b7e09c3617ef"} Nov 24 09:16:16 crc kubenswrapper[4563]: I1124 09:16:16.063146 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5bdf4f7f7f-6n5jh" event={"ID":"9fb1ddc7-1195-412e-93ed-4799bc756bae","Type":"ContainerStarted","Data":"817d666ff21cfc7f0f25cf1bc78cfc88e4cb761a853da92e1eb2be3252afe3f3"} Nov 24 09:16:16 crc kubenswrapper[4563]: E1124 09:16:16.064915 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:5d49d4594c66eda7b151746cc6e1d3c67c0129b4503eeb043a64ae8ec2da6a1b\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-5bdf4f7f7f-6n5jh" podUID="9fb1ddc7-1195-412e-93ed-4799bc756bae" Nov 24 09:16:16 crc 
kubenswrapper[4563]: I1124 09:16:16.065524 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6fdc856c5d-h78s9" event={"ID":"71d78263-9c76-454f-8b9f-1392c9fcfc2f","Type":"ContainerStarted","Data":"cb2e723cad1b5564c91b61b0bec115378e0b1b0fe7c79c8bfc51c9da8ebb9eb9"} Nov 24 09:16:16 crc kubenswrapper[4563]: I1124 09:16:16.068236 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-79d88dcd444qmtr" event={"ID":"974a1619-7c48-46d6-b639-5f965c6b747a","Type":"ContainerStarted","Data":"952c5e4e3cb85531a4453f397fcac3eb946623371a67df2e5c68d1e3e616adb8"} Nov 24 09:16:16 crc kubenswrapper[4563]: I1124 09:16:16.068290 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-79d88dcd444qmtr" event={"ID":"974a1619-7c48-46d6-b639-5f965c6b747a","Type":"ContainerStarted","Data":"fc873657a1fb1f883dfdc8b569ee23063d589ffaa01e644aa8b0c6edf05b96d8"} Nov 24 09:16:16 crc kubenswrapper[4563]: E1124 09:16:16.069609 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:78852f8ba332a5756c1551c126157f735279101a0fc3277ba4aa4db3478789dd\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-79d88dcd444qmtr" podUID="974a1619-7c48-46d6-b639-5f965c6b747a" Nov 24 09:16:16 crc kubenswrapper[4563]: I1124 09:16:16.090989 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5d86b44686-4x76m" event={"ID":"70a63634-9a9f-46b3-af05-9dc02c0a03e1","Type":"ContainerStarted","Data":"2b638cdf67addfe80d12a573de30432a7be54a5a896ead72b3d9316266070dbb"} Nov 24 09:16:16 crc kubenswrapper[4563]: E1124 09:16:16.096833 4563 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-66b7d6f598-fffcm" podUID="ffcb9e74-1697-402a-b77b-5a3ecc832759" Nov 24 09:16:16 crc kubenswrapper[4563]: I1124 09:16:16.098916 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8667fbf6f6-k9wzp" event={"ID":"77d539d7-5235-4576-a276-8247c5824020","Type":"ContainerStarted","Data":"4c0149168bcb923bf3290690d4c09be14380e65d4eb8369dbf1eecb8704148e6"} Nov 24 09:16:16 crc kubenswrapper[4563]: E1124 09:16:16.108759 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-7798859c74-z5b6f" podUID="9d2aa6d4-db94-44dd-99f1-6e95e8de9a5f" Nov 24 09:16:16 crc kubenswrapper[4563]: I1124 09:16:16.110200 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7879fb76fd-4tv9l" event={"ID":"ebed0d67-0bac-4d1f-a2d0-2e367d78d157","Type":"ContainerStarted","Data":"c00feb1e3e8e0142379b320342bc56a2ab73c01044451f5b6893ff4959c98f8f"} Nov 24 09:16:16 crc kubenswrapper[4563]: I1124 09:16:16.118829 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-56dfb6b67f-77wgb" event={"ID":"17904228-d0e5-489c-a965-5cba44f3b3f2","Type":"ContainerStarted","Data":"c6425e114392e71e70792ac6ea88c2eee264a8e2232ad1adf76817ee90467814"} Nov 24 09:16:16 crc kubenswrapper[4563]: I1124 09:16:16.123147 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-769d9c7585-4f5hq" event={"ID":"68eeb4a0-b192-4e6a-b02b-f34415b29316","Type":"ContainerStarted","Data":"85bba01aebe75482e3283ade43985418da8d93b99b609271fcad3b74344d6d58"} Nov 24 09:16:16 crc 
kubenswrapper[4563]: I1124 09:16:16.130780 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-86d796d84d-vkltr" event={"ID":"c089c738-65b8-46e2-91c9-59b962081c05","Type":"ContainerStarted","Data":"bb5148d7ff158cd6bfb38ed9829143ea6b984def52cbc84745b4f7194033e18d"} Nov 24 09:16:16 crc kubenswrapper[4563]: E1124 09:16:16.134513 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-7cd4fb6f79-qhzw4" podUID="d5e12170-5cc0-4f8f-89d7-c64f38f2226e" Nov 24 09:16:16 crc kubenswrapper[4563]: I1124 09:16:16.142959 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-bf4c6585d-tnxst" event={"ID":"b4f4311c-5634-4bae-8659-5efa662f0562","Type":"ContainerStarted","Data":"ee3825b2dcd854932a700f9c8b8d032bf458a381832f698ce6206598e7d6763e"} Nov 24 09:16:16 crc kubenswrapper[4563]: I1124 09:16:16.145903 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8464cf66df-chpfj" event={"ID":"238f517b-0e10-411c-8b3c-c6bdbe261159","Type":"ContainerStarted","Data":"2b4e73b3f785950fd9abdf26f2864ef49e8de3f233a16f9b5534ae38e6513826"} Nov 24 09:16:16 crc kubenswrapper[4563]: I1124 09:16:16.145997 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8464cf66df-chpfj" event={"ID":"238f517b-0e10-411c-8b3c-c6bdbe261159","Type":"ContainerStarted","Data":"50c61be7088fcfcb7f92dd674a1a7bd79ff78412f0b0bb9deae9efdd35cf6cd6"} Nov 24 09:16:16 crc kubenswrapper[4563]: I1124 09:16:16.147176 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5889cddd94-gx6ft" 
event={"ID":"45111afc-fb32-4938-ae09-118dc9b31c06","Type":"ContainerStarted","Data":"d507a31af78bc42b739f8c8803ba4629a3cfe6fccc099a1ec11e4cfe76a80194"} Nov 24 09:16:16 crc kubenswrapper[4563]: E1124 09:16:16.148311 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:82207e753574d4be246f86c4b074500d66cf20214aa80f0a8525cf3287a35e6d\\\"\"" pod="openstack-operators/test-operator-controller-manager-8464cf66df-chpfj" podUID="238f517b-0e10-411c-8b3c-c6bdbe261159" Nov 24 09:16:16 crc kubenswrapper[4563]: I1124 09:16:16.149515 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7cd4fb6f79-qhzw4" event={"ID":"d5e12170-5cc0-4f8f-89d7-c64f38f2226e","Type":"ContainerStarted","Data":"c7a405e8a19c8357f8930b1fa407dc10c4c3709c76550f3266457071cf4760be"} Nov 24 09:16:16 crc kubenswrapper[4563]: I1124 09:16:16.151391 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-799cb6ffd6-wck8j" event={"ID":"31e8d237-829e-47b0-8a2c-8e316a37dc78","Type":"ContainerStarted","Data":"34042c5855858f187520a118c30acf38a682046ae0d5703dc99d3c96df5ffdfe"} Nov 24 09:16:16 crc kubenswrapper[4563]: E1124 09:16:16.151737 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-7cd4fb6f79-qhzw4" podUID="d5e12170-5cc0-4f8f-89d7-c64f38f2226e" Nov 24 09:16:16 crc kubenswrapper[4563]: I1124 09:16:16.166353 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6cb9dc54f8-m7w2q"] Nov 24 
09:16:16 crc kubenswrapper[4563]: I1124 09:16:16.171747 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-66b7d6f598-fffcm" event={"ID":"ffcb9e74-1697-402a-b77b-5a3ecc832759","Type":"ContainerStarted","Data":"d63435fd07f2f4d48dc9ae0c8a84f00c3b4adb78e66b869621d8f2108efec6d0"}
Nov 24 09:16:16 crc kubenswrapper[4563]: W1124 09:16:16.187928 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdec1b8b_630a_452a_b4d9_3cd42ef204c7.slice/crio-6e34e25ac7b001b6869e73c1a527df4f1bc600a837a5b94789fab106964345a5 WatchSource:0}: Error finding container 6e34e25ac7b001b6869e73c1a527df4f1bc600a837a5b94789fab106964345a5: Status 404 returned error can't find the container with id 6e34e25ac7b001b6869e73c1a527df4f1bc600a837a5b94789fab106964345a5
Nov 24 09:16:16 crc kubenswrapper[4563]: E1124 09:16:16.188040 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:207578cb433471cc1a79c21a808c8a15489d1d3c9fa77e29f3f697c33917fec6\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-66b7d6f598-fffcm" podUID="ffcb9e74-1697-402a-b77b-5a3ecc832759"
Nov 24 09:16:16 crc kubenswrapper[4563]: I1124 09:16:16.193905 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6f8c5b86cb-94tjk" event={"ID":"a30aea9a-f4c8-42a3-89bb-af9ffef55544","Type":"ContainerStarted","Data":"787b032f767ea1c9b0fb7912aacd059c43b85aa8baac103398902091d790d9f9"}
Nov 24 09:16:16 crc kubenswrapper[4563]: I1124 09:16:16.205521 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5c75d7c94b-ltqbl" event={"ID":"26aa13a3-737a-457f-9d46-29018cfccd1e","Type":"ContainerStarted","Data":"e1da724ce5e0ca6ca11511d732574b88f413d46c1e6c00cf81fab81c0b886a6d"}
Nov 24 09:16:16 crc kubenswrapper[4563]: I1124 09:16:16.207119 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-g64g6" event={"ID":"00f5e4f8-193c-48df-b29f-8f359f263a5a","Type":"ContainerStarted","Data":"fdad178df5db45eef35cb616359940161ca9a912847bd48df7c0de12331efe4c"}
Nov 24 09:16:16 crc kubenswrapper[4563]: I1124 09:16:16.208757 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-6dc664666c-6flr8" event={"ID":"6a018387-ddf9-40f3-a421-d1a760581c8f","Type":"ContainerStarted","Data":"68a8f922cdc2023f737b678db61661b0b2f7ce4fcdc37cfd8f56ceaa79ffc797"}
Nov 24 09:16:16 crc kubenswrapper[4563]: E1124 09:16:16.209165 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-g64g6" podUID="00f5e4f8-193c-48df-b29f-8f359f263a5a"
Nov 24 09:16:16 crc kubenswrapper[4563]: I1124 09:16:16.223773 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7798859c74-z5b6f" event={"ID":"9d2aa6d4-db94-44dd-99f1-6e95e8de9a5f","Type":"ContainerStarted","Data":"99ae1d87bef2552d5c40cfd6f8b2c8a13aea456ed0b78e748555f436e553df1a"}
Nov 24 09:16:16 crc kubenswrapper[4563]: E1124 09:16:16.224027 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-6f8c5b86cb-94tjk" podUID="a30aea9a-f4c8-42a3-89bb-af9ffef55544"
Nov 24 09:16:16 crc kubenswrapper[4563]: E1124 09:16:16.229041 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:5324a6d2f76fc3041023b0cbd09a733ef2b59f310d390e4d6483d219eb96494f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7798859c74-z5b6f" podUID="9d2aa6d4-db94-44dd-99f1-6e95e8de9a5f"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.068202 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5284336-a335-4d2f-a960-d133a6b32dc6" path="/var/lib/kubelet/pods/b5284336-a335-4d2f-a960-d133a6b32dc6/volumes"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.068628 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2ebd330-0f96-4619-992b-d73bddd5ca58" path="/var/lib/kubelet/pods/d2ebd330-0f96-4619-992b-d73bddd5ca58/volumes"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.246956 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6f8c5b86cb-94tjk" event={"ID":"a30aea9a-f4c8-42a3-89bb-af9ffef55544","Type":"ContainerStarted","Data":"5a7b69355e9a12c993f92fc028b6ab887295b9d4ece33113cd47ea2541d8650d"}
Nov 24 09:16:17 crc kubenswrapper[4563]: E1124 09:16:17.255196 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:7b90521b9e9cb4eb43c2f1c3bf85dbd068d684315f4f705b07708dd078df9d04\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6f8c5b86cb-94tjk" podUID="a30aea9a-f4c8-42a3-89bb-af9ffef55544"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.260990 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6cb9dc54f8-m7w2q" event={"ID":"cdec1b8b-630a-452a-b4d9-3cd42ef204c7","Type":"ContainerStarted","Data":"915236605d0093da3590633e60b38d5bc0501dfe6af743429dac92008cba44ed"}
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.261028 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6cb9dc54f8-m7w2q" event={"ID":"cdec1b8b-630a-452a-b4d9-3cd42ef204c7","Type":"ContainerStarted","Data":"ced9c9828147cec74ad285f281807129b1fe2313bb8e8642f39698bd25328c52"}
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.261054 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6cb9dc54f8-m7w2q" event={"ID":"cdec1b8b-630a-452a-b4d9-3cd42ef204c7","Type":"ContainerStarted","Data":"6e34e25ac7b001b6869e73c1a527df4f1bc600a837a5b94789fab106964345a5"}
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.262172 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6cb9dc54f8-m7w2q"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.279066 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7798859c74-z5b6f" event={"ID":"9d2aa6d4-db94-44dd-99f1-6e95e8de9a5f","Type":"ContainerStarted","Data":"2c167cbbf36690bbf8bd9c5e2f442bacb9c3e10a42e5e242444c1151eceffe5f"}
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.282035 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-869d86c9c4-kbsk7"]
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.283713 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-869d86c9c4-kbsk7"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.290980 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-869d86c9c4-kbsk7"]
Nov 24 09:16:17 crc kubenswrapper[4563]: E1124 09:16:17.291280 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:5324a6d2f76fc3041023b0cbd09a733ef2b59f310d390e4d6483d219eb96494f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7798859c74-z5b6f" podUID="9d2aa6d4-db94-44dd-99f1-6e95e8de9a5f"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.292476 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.292897 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.293150 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5889cddd94-gx6ft" event={"ID":"45111afc-fb32-4938-ae09-118dc9b31c06","Type":"ContainerStarted","Data":"b12e01e0778960553523d078ca012e431525007f5257809b2f1112d28c472883"}
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.294540 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5889cddd94-gx6ft"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.297628 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7cd4fb6f79-qhzw4" event={"ID":"d5e12170-5cc0-4f8f-89d7-c64f38f2226e","Type":"ContainerStarted","Data":"2e77ff110544d61d63b56b07e0b8f9d76665c67611b0fe9f061ff75afbbfda7e"}
Nov 24 09:16:17 crc kubenswrapper[4563]: E1124 09:16:17.299379 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-7cd4fb6f79-qhzw4" podUID="d5e12170-5cc0-4f8f-89d7-c64f38f2226e"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.301065 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.301192 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.301660 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.301678 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.302749 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5889cddd94-gx6ft"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.303887 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6cb9dc54f8-m7w2q" podStartSLOduration=3.303874573 podStartE2EDuration="3.303874573s" podCreationTimestamp="2025-11-24 09:16:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:16:17.301956907 +0000 UTC m=+754.560934355" watchObservedRunningTime="2025-11-24 09:16:17.303874573 +0000 UTC m=+754.562852020"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.307143 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-66b7d6f598-fffcm" event={"ID":"ffcb9e74-1697-402a-b77b-5a3ecc832759","Type":"ContainerStarted","Data":"5f29515222985b19e4f65ec7cd95082366252034b12d9c323697993bf752a401"}
Nov 24 09:16:17 crc kubenswrapper[4563]: E1124 09:16:17.308891 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-g64g6" podUID="00f5e4f8-193c-48df-b29f-8f359f263a5a"
Nov 24 09:16:17 crc kubenswrapper[4563]: E1124 09:16:17.308961 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:207578cb433471cc1a79c21a808c8a15489d1d3c9fa77e29f3f697c33917fec6\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-66b7d6f598-fffcm" podUID="ffcb9e74-1697-402a-b77b-5a3ecc832759"
Nov 24 09:16:17 crc kubenswrapper[4563]: E1124 09:16:17.309064 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:78852f8ba332a5756c1551c126157f735279101a0fc3277ba4aa4db3478789dd\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-79d88dcd444qmtr" podUID="974a1619-7c48-46d6-b639-5f965c6b747a"
Nov 24 09:16:17 crc kubenswrapper[4563]: E1124 09:16:17.309107 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:5d49d4594c66eda7b151746cc6e1d3c67c0129b4503eeb043a64ae8ec2da6a1b\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-5bdf4f7f7f-6n5jh" podUID="9fb1ddc7-1195-412e-93ed-4799bc756bae"
Nov 24 09:16:17 crc kubenswrapper[4563]: E1124 09:16:17.309320 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:82207e753574d4be246f86c4b074500d66cf20214aa80f0a8525cf3287a35e6d\\\"\"" pod="openstack-operators/test-operator-controller-manager-8464cf66df-chpfj" podUID="238f517b-0e10-411c-8b3c-c6bdbe261159"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.342666 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5889cddd94-gx6ft" podStartSLOduration=4.342620741 podStartE2EDuration="4.342620741s" podCreationTimestamp="2025-11-24 09:16:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:16:17.338855601 +0000 UTC m=+754.597833047" watchObservedRunningTime="2025-11-24 09:16:17.342620741 +0000 UTC m=+754.601598187"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.367908 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bc0c91d3-a60a-4695-bacb-01ba85ba56e1-client-ca\") pod \"route-controller-manager-869d86c9c4-kbsk7\" (UID: \"bc0c91d3-a60a-4695-bacb-01ba85ba56e1\") " pod="openshift-route-controller-manager/route-controller-manager-869d86c9c4-kbsk7"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.368144 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc0c91d3-a60a-4695-bacb-01ba85ba56e1-serving-cert\") pod \"route-controller-manager-869d86c9c4-kbsk7\" (UID: \"bc0c91d3-a60a-4695-bacb-01ba85ba56e1\") " pod="openshift-route-controller-manager/route-controller-manager-869d86c9c4-kbsk7"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.368234 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc0c91d3-a60a-4695-bacb-01ba85ba56e1-config\") pod \"route-controller-manager-869d86c9c4-kbsk7\" (UID: \"bc0c91d3-a60a-4695-bacb-01ba85ba56e1\") " pod="openshift-route-controller-manager/route-controller-manager-869d86c9c4-kbsk7"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.368281 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzrxq\" (UniqueName: \"kubernetes.io/projected/bc0c91d3-a60a-4695-bacb-01ba85ba56e1-kube-api-access-rzrxq\") pod \"route-controller-manager-869d86c9c4-kbsk7\" (UID: \"bc0c91d3-a60a-4695-bacb-01ba85ba56e1\") " pod="openshift-route-controller-manager/route-controller-manager-869d86c9c4-kbsk7"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.469406 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc0c91d3-a60a-4695-bacb-01ba85ba56e1-serving-cert\") pod \"route-controller-manager-869d86c9c4-kbsk7\" (UID: \"bc0c91d3-a60a-4695-bacb-01ba85ba56e1\") " pod="openshift-route-controller-manager/route-controller-manager-869d86c9c4-kbsk7"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.469523 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc0c91d3-a60a-4695-bacb-01ba85ba56e1-config\") pod \"route-controller-manager-869d86c9c4-kbsk7\" (UID: \"bc0c91d3-a60a-4695-bacb-01ba85ba56e1\") " pod="openshift-route-controller-manager/route-controller-manager-869d86c9c4-kbsk7"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.469559 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzrxq\" (UniqueName: \"kubernetes.io/projected/bc0c91d3-a60a-4695-bacb-01ba85ba56e1-kube-api-access-rzrxq\") pod \"route-controller-manager-869d86c9c4-kbsk7\" (UID: \"bc0c91d3-a60a-4695-bacb-01ba85ba56e1\") " pod="openshift-route-controller-manager/route-controller-manager-869d86c9c4-kbsk7"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.469617 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bc0c91d3-a60a-4695-bacb-01ba85ba56e1-client-ca\") pod \"route-controller-manager-869d86c9c4-kbsk7\" (UID: \"bc0c91d3-a60a-4695-bacb-01ba85ba56e1\") " pod="openshift-route-controller-manager/route-controller-manager-869d86c9c4-kbsk7"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.471179 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bc0c91d3-a60a-4695-bacb-01ba85ba56e1-client-ca\") pod \"route-controller-manager-869d86c9c4-kbsk7\" (UID: \"bc0c91d3-a60a-4695-bacb-01ba85ba56e1\") " pod="openshift-route-controller-manager/route-controller-manager-869d86c9c4-kbsk7"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.471226 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc0c91d3-a60a-4695-bacb-01ba85ba56e1-config\") pod \"route-controller-manager-869d86c9c4-kbsk7\" (UID: \"bc0c91d3-a60a-4695-bacb-01ba85ba56e1\") " pod="openshift-route-controller-manager/route-controller-manager-869d86c9c4-kbsk7"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.482404 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc0c91d3-a60a-4695-bacb-01ba85ba56e1-serving-cert\") pod \"route-controller-manager-869d86c9c4-kbsk7\" (UID: \"bc0c91d3-a60a-4695-bacb-01ba85ba56e1\") " pod="openshift-route-controller-manager/route-controller-manager-869d86c9c4-kbsk7"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.484852 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzrxq\" (UniqueName: \"kubernetes.io/projected/bc0c91d3-a60a-4695-bacb-01ba85ba56e1-kube-api-access-rzrxq\") pod \"route-controller-manager-869d86c9c4-kbsk7\" (UID: \"bc0c91d3-a60a-4695-bacb-01ba85ba56e1\") " pod="openshift-route-controller-manager/route-controller-manager-869d86c9c4-kbsk7"
Nov 24 09:16:17 crc kubenswrapper[4563]: I1124 09:16:17.622893 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-869d86c9c4-kbsk7"
Nov 24 09:16:18 crc kubenswrapper[4563]: I1124 09:16:18.038946 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-869d86c9c4-kbsk7"]
Nov 24 09:16:18 crc kubenswrapper[4563]: E1124 09:16:18.317470 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:207578cb433471cc1a79c21a808c8a15489d1d3c9fa77e29f3f697c33917fec6\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-66b7d6f598-fffcm" podUID="ffcb9e74-1697-402a-b77b-5a3ecc832759"
Nov 24 09:16:18 crc kubenswrapper[4563]: E1124 09:16:18.317493 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:7b90521b9e9cb4eb43c2f1c3bf85dbd068d684315f4f705b07708dd078df9d04\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6f8c5b86cb-94tjk" podUID="a30aea9a-f4c8-42a3-89bb-af9ffef55544"
Nov 24 09:16:18 crc kubenswrapper[4563]: E1124 09:16:18.317468 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:5324a6d2f76fc3041023b0cbd09a733ef2b59f310d390e4d6483d219eb96494f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7798859c74-z5b6f" podUID="9d2aa6d4-db94-44dd-99f1-6e95e8de9a5f"
Nov 24 09:16:18 crc kubenswrapper[4563]: E1124 09:16:18.317554 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-7cd4fb6f79-qhzw4" podUID="d5e12170-5cc0-4f8f-89d7-c64f38f2226e"
Nov 24 09:16:19 crc kubenswrapper[4563]: I1124 09:16:19.320734 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-869d86c9c4-kbsk7" event={"ID":"bc0c91d3-a60a-4695-bacb-01ba85ba56e1","Type":"ContainerStarted","Data":"2e8d3f0acfb78aa657d099939c5d8171c79d32e082cb0088077ee630076c8d6e"}
Nov 24 09:16:20 crc kubenswrapper[4563]: I1124 09:16:20.564015 4563 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Nov 24 09:16:24 crc kubenswrapper[4563]: I1124 09:16:24.365182 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-6dc664666c-6flr8" event={"ID":"6a018387-ddf9-40f3-a421-d1a760581c8f","Type":"ContainerStarted","Data":"cbfdcf243339bafb7f60cb821f973fea4cc45efb23e917e0f0d94572c75b8b4e"}
Nov 24 09:16:24 crc kubenswrapper[4563]: I1124 09:16:24.367353 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-869d86c9c4-kbsk7" event={"ID":"bc0c91d3-a60a-4695-bacb-01ba85ba56e1","Type":"ContainerStarted","Data":"d7101bf8f2d2d4db274e2648fefed66be250c81d1dc0f42d616bb2894cf9f134"}
Nov 24 09:16:24 crc kubenswrapper[4563]: I1124 09:16:24.368588 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-869d86c9c4-kbsk7"
Nov 24 09:16:24 crc kubenswrapper[4563]: I1124 09:16:24.373424 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5d86b44686-4x76m" event={"ID":"70a63634-9a9f-46b3-af05-9dc02c0a03e1","Type":"ContainerStarted","Data":"d1331cb009cbf2af27e2d92caba27c9b272a82a16b7e3f23f6eea4e38a6172c5"}
Nov 24 09:16:24 crc kubenswrapper[4563]: I1124 09:16:24.375768 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-769d9c7585-4f5hq" event={"ID":"68eeb4a0-b192-4e6a-b02b-f34415b29316","Type":"ContainerStarted","Data":"f63ffc6df0672b102a683c310a3631607310c24c7760f81855731a0529d79a17"}
Nov 24 09:16:24 crc kubenswrapper[4563]: I1124 09:16:24.377774 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6d8fd67bf7-jnx9f" event={"ID":"a62a6523-e592-437f-b3ba-320e24f619dc","Type":"ContainerStarted","Data":"7d558291923accdbac7ad592cf30a60d4fd0d535903919c1d81c5cf625e4594f"}
Nov 24 09:16:24 crc kubenswrapper[4563]: I1124 09:16:24.379320 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8667fbf6f6-k9wzp" event={"ID":"77d539d7-5235-4576-a276-8247c5824020","Type":"ContainerStarted","Data":"faa79ad12403eb81498f9a4a831ece085173a3b0b65bd869cde1774fcc2af705"}
Nov 24 09:16:24 crc kubenswrapper[4563]: I1124 09:16:24.381082 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6fdc856c5d-h78s9" event={"ID":"71d78263-9c76-454f-8b9f-1392c9fcfc2f","Type":"ContainerStarted","Data":"01e055f0f1dc84992aac72b951026637d0659d76ace62704f6615e15d4a2dd6f"}
Nov 24 09:16:24 crc kubenswrapper[4563]: I1124 09:16:24.381723 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-869d86c9c4-kbsk7"
Nov 24 09:16:24 crc kubenswrapper[4563]: I1124 09:16:24.382815 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5c75d7c94b-ltqbl" event={"ID":"26aa13a3-737a-457f-9d46-29018cfccd1e","Type":"ContainerStarted","Data":"a087e42667d796751f47098695c60be7c1c386a29ea2bcc96bc190731b9ff076"}
Nov 24 09:16:24 crc kubenswrapper[4563]: I1124 09:16:24.383954 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7768f8c84f-glf4s" event={"ID":"f81c148e-bf8e-4b57-895e-f2c11411cf7a","Type":"ContainerStarted","Data":"7db61d932af1e6b9aa2ce269b16be1e22c0bde42fac256a921b91d21b3a791a8"}
Nov 24 09:16:24 crc kubenswrapper[4563]: I1124 09:16:24.389094 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-799cb6ffd6-wck8j" event={"ID":"31e8d237-829e-47b0-8a2c-8e316a37dc78","Type":"ContainerStarted","Data":"f22e74beea36f30fb734d2e450292bf49bd20ec35557c068148662c3975c1ec8"}
Nov 24 09:16:24 crc kubenswrapper[4563]: I1124 09:16:24.399601 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-869d86c9c4-kbsk7" podStartSLOduration=11.399585615 podStartE2EDuration="11.399585615s" podCreationTimestamp="2025-11-24 09:16:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:16:24.39312961 +0000 UTC m=+761.652107057" watchObservedRunningTime="2025-11-24 09:16:24.399585615 +0000 UTC m=+761.658563063"
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.401983 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7768f8c84f-glf4s" event={"ID":"f81c148e-bf8e-4b57-895e-f2c11411cf7a","Type":"ContainerStarted","Data":"8402d7faaf4e4fcd2031771bfb09aec000f1a841fdeebc610fa7d3285a320d8b"}
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.402338 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7768f8c84f-glf4s"
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.403916 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-769d9c7585-4f5hq" event={"ID":"68eeb4a0-b192-4e6a-b02b-f34415b29316","Type":"ContainerStarted","Data":"e34a5bc50f31e2c10e1c83360f5c5f9fb511700f4098e7483c140d37d67f69c6"}
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.404805 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-769d9c7585-4f5hq"
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.418929 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6fdc856c5d-h78s9" event={"ID":"71d78263-9c76-454f-8b9f-1392c9fcfc2f","Type":"ContainerStarted","Data":"035f7202a11f1e3d68b3505fb09246bb5aa09c88a1901270d1bbe32b30b00a1f"}
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.419064 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6fdc856c5d-h78s9"
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.423694 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-6dc664666c-6flr8" event={"ID":"6a018387-ddf9-40f3-a421-d1a760581c8f","Type":"ContainerStarted","Data":"6ca05cb42622af39e6658dc106030266f00634721a9c3175cc97f083a82ccb95"}
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.423818 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-6dc664666c-6flr8"
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.425186 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-86d796d84d-vkltr" event={"ID":"c089c738-65b8-46e2-91c9-59b962081c05","Type":"ContainerStarted","Data":"84f7866b84aab5cbc0cb3f6ccc96d1aa9f5a683dc345135ebc94f702aed95de2"}
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.425223 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-86d796d84d-vkltr" event={"ID":"c089c738-65b8-46e2-91c9-59b962081c05","Type":"ContainerStarted","Data":"1d2fb08464dc8523782e391a74901c0539c6b1a3fc46dc1603e1b9b6e0a19a70"}
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.425287 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-86d796d84d-vkltr"
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.428229 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-bf4c6585d-tnxst" event={"ID":"b4f4311c-5634-4bae-8659-5efa662f0562","Type":"ContainerStarted","Data":"133cc7fac5a424e058664f0037dcf133eb48a1b7af4e11fad7a7824f97d8a546"}
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.428282 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-bf4c6585d-tnxst" event={"ID":"b4f4311c-5634-4bae-8659-5efa662f0562","Type":"ContainerStarted","Data":"ff9ae4e2f39b5dd6ef8142ede5f87dfb0de35a67feef6a9863d05a923b1fdf0b"}
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.428303 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-bf4c6585d-tnxst"
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.430080 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-799cb6ffd6-wck8j" event={"ID":"31e8d237-829e-47b0-8a2c-8e316a37dc78","Type":"ContainerStarted","Data":"c317302fc113d2d90af9cfe96bc9cc9ec521d406681f19e8e552cb024cfa392d"}
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.430221 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-799cb6ffd6-wck8j"
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.431533 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5d86b44686-4x76m" event={"ID":"70a63634-9a9f-46b3-af05-9dc02c0a03e1","Type":"ContainerStarted","Data":"846f029cc13712ec8cffb3d86a5be46b90381e60b83d3f8f8e00674765db7a5c"}
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.431600 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5d86b44686-4x76m"
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.433979 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7bb88cb858-44jfn" event={"ID":"13a3e7f4-4c3d-46e7-a7ee-612f9ac17bc7","Type":"ContainerStarted","Data":"f48e8463e109d9c25533455feac569b9d425c3a59724bc6fe5dba065595ea6dc"}
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.434022 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7bb88cb858-44jfn" event={"ID":"13a3e7f4-4c3d-46e7-a7ee-612f9ac17bc7","Type":"ContainerStarted","Data":"b4094f768c6215b494ee62e8249d2cb0c67277ca80d2b69de0fa7f2c7fe59ec9"}
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.434100 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7bb88cb858-44jfn"
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.435578 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7768f8c84f-glf4s" podStartSLOduration=3.349583614 podStartE2EDuration="12.435567216s" podCreationTimestamp="2025-11-24 09:16:13 +0000 UTC" firstStartedPulling="2025-11-24 09:16:14.802399494 +0000 UTC m=+752.061376941" lastFinishedPulling="2025-11-24 09:16:23.888383096 +0000 UTC m=+761.147360543" observedRunningTime="2025-11-24 09:16:25.428088472 +0000 UTC m=+762.687065920" watchObservedRunningTime="2025-11-24 09:16:25.435567216 +0000 UTC m=+762.694544663"
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.436184 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8667fbf6f6-k9wzp" event={"ID":"77d539d7-5235-4576-a276-8247c5824020","Type":"ContainerStarted","Data":"161f50d1f75cfff2c0a129b71c2771249ec70db9d9b2a2b78d72cffcaadcefa6"}
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.436312 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-8667fbf6f6-k9wzp"
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.437725 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6d8fd67bf7-jnx9f" event={"ID":"a62a6523-e592-437f-b3ba-320e24f619dc","Type":"ContainerStarted","Data":"707069c9fc7d4b277c1b460e9df7b4cba1cb3700cb989bb1085b1aebe4f0640b"}
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.438115 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6d8fd67bf7-jnx9f"
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.440037 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5c75d7c94b-ltqbl" event={"ID":"26aa13a3-737a-457f-9d46-29018cfccd1e","Type":"ContainerStarted","Data":"09f1f6bd80d3121a547b969b3d14efcbc55f65c8cb05de31547426332509262d"}
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.440403 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5c75d7c94b-ltqbl"
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.441917 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7879fb76fd-4tv9l" event={"ID":"ebed0d67-0bac-4d1f-a2d0-2e367d78d157","Type":"ContainerStarted","Data":"c4fe389c9727cdd02d15c71f73bd4d3c730876519c859e0ea9411e441f8477a1"}
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.442206 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7879fb76fd-4tv9l"
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.443801 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-56dfb6b67f-77wgb" event={"ID":"17904228-d0e5-489c-a965-5cba44f3b3f2","Type":"ContainerStarted","Data":"f7434f4a82a0043a3d523ff766fee76c357e6541d0195d0058a70f96bd36fc82"}
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.443824 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-56dfb6b67f-77wgb" event={"ID":"17904228-d0e5-489c-a965-5cba44f3b3f2","Type":"ContainerStarted","Data":"a0762cd36ae04823aab5d7ed807a62d3baf2d34d33184e875084fe4f34c6ab3c"}
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.443837 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-56dfb6b67f-77wgb"
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.449670 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-6fdc856c5d-h78s9" podStartSLOduration=3.087619692 podStartE2EDuration="11.449660693s" podCreationTimestamp="2025-11-24 09:16:14 +0000 UTC" firstStartedPulling="2025-11-24 09:16:15.565933847 +0000 UTC m=+752.824911294" lastFinishedPulling="2025-11-24 09:16:23.927974848 +0000 UTC m=+761.186952295" observedRunningTime="2025-11-24 09:16:25.449540587 +0000 UTC m=+762.708518034" watchObservedRunningTime="2025-11-24 09:16:25.449660693 +0000 UTC m=+762.708638140"
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.468605 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-86d796d84d-vkltr" podStartSLOduration=3.317230501 podStartE2EDuration="11.468593537s" podCreationTimestamp="2025-11-24 09:16:14 +0000 UTC" firstStartedPulling="2025-11-24 09:16:15.795050202 +0000 UTC m=+753.054027639" lastFinishedPulling="2025-11-24 09:16:23.946413228 +0000 UTC m=+761.205390675" observedRunningTime="2025-11-24 09:16:25.467269701 +0000 UTC m=+762.726247147" watchObservedRunningTime="2025-11-24 09:16:25.468593537 +0000 UTC m=+762.727570984"
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.496366 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-769d9c7585-4f5hq" podStartSLOduration=3.9697973749999997 podStartE2EDuration="12.496353797s" podCreationTimestamp="2025-11-24 09:16:13 +0000 UTC" firstStartedPulling="2025-11-24 09:16:15.363355928 +0000 UTC m=+752.622333375" lastFinishedPulling="2025-11-24 09:16:23.889912349 +0000 UTC m=+761.148889797" observedRunningTime="2025-11-24 09:16:25.492259605 +0000 UTC m=+762.751237053" watchObservedRunningTime="2025-11-24 09:16:25.496353797 +0000 UTC m=+762.755331244"
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.514328 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-6dc664666c-6flr8" podStartSLOduration=3.429475459 podStartE2EDuration="11.514320067s" podCreationTimestamp="2025-11-24 09:16:14 +0000 UTC" firstStartedPulling="2025-11-24 09:16:15.79560105 +0000 UTC m=+753.054578497" lastFinishedPulling="2025-11-24 09:16:23.880445658 +0000 UTC m=+761.139423105" observedRunningTime="2025-11-24 09:16:25.511512121 +0000 UTC m=+762.770489568" watchObservedRunningTime="2025-11-24 09:16:25.514320067 +0000 UTC m=+762.773297513"
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.554761 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-8667fbf6f6-k9wzp" podStartSLOduration=4.047453749 podStartE2EDuration="12.554731324s" podCreationTimestamp="2025-11-24 09:16:13 +0000 UTC" firstStartedPulling="2025-11-24 09:16:15.372452833 +0000 UTC m=+752.631430280" lastFinishedPulling="2025-11-24 09:16:23.879730408 +0000 UTC m=+761.138707855" observedRunningTime="2025-11-24 09:16:25.53572948 +0000 UTC m=+762.794706927" watchObservedRunningTime="2025-11-24 09:16:25.554731324 +0000 UTC m=+762.813708771"
Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.557894 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7bb88cb858-44jfn"
podStartSLOduration=4.19277326 podStartE2EDuration="12.557886083s" podCreationTimestamp="2025-11-24 09:16:13 +0000 UTC" firstStartedPulling="2025-11-24 09:16:15.566258328 +0000 UTC m=+752.825235776" lastFinishedPulling="2025-11-24 09:16:23.931371152 +0000 UTC m=+761.190348599" observedRunningTime="2025-11-24 09:16:25.552672411 +0000 UTC m=+762.811649859" watchObservedRunningTime="2025-11-24 09:16:25.557886083 +0000 UTC m=+762.816863531" Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.578173 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7879fb76fd-4tv9l" podStartSLOduration=4.250776141 podStartE2EDuration="12.578156499s" podCreationTimestamp="2025-11-24 09:16:13 +0000 UTC" firstStartedPulling="2025-11-24 09:16:15.568470621 +0000 UTC m=+752.827448068" lastFinishedPulling="2025-11-24 09:16:23.895850979 +0000 UTC m=+761.154828426" observedRunningTime="2025-11-24 09:16:25.574062308 +0000 UTC m=+762.833039755" watchObservedRunningTime="2025-11-24 09:16:25.578156499 +0000 UTC m=+762.837133947" Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.627614 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6d8fd67bf7-jnx9f" podStartSLOduration=3.778610567 podStartE2EDuration="12.627594378s" podCreationTimestamp="2025-11-24 09:16:13 +0000 UTC" firstStartedPulling="2025-11-24 09:16:15.028840673 +0000 UTC m=+752.287818121" lastFinishedPulling="2025-11-24 09:16:23.877824485 +0000 UTC m=+761.136801932" observedRunningTime="2025-11-24 09:16:25.603506143 +0000 UTC m=+762.862483590" watchObservedRunningTime="2025-11-24 09:16:25.627594378 +0000 UTC m=+762.886571825" Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.638597 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6cb9dc54f8-m7w2q" Nov 24 09:16:25 crc 
kubenswrapper[4563]: I1124 09:16:25.657230 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5c75d7c94b-ltqbl" podStartSLOduration=4.55508278 podStartE2EDuration="12.657213935s" podCreationTimestamp="2025-11-24 09:16:13 +0000 UTC" firstStartedPulling="2025-11-24 09:16:15.795181369 +0000 UTC m=+753.054158816" lastFinishedPulling="2025-11-24 09:16:23.897312524 +0000 UTC m=+761.156289971" observedRunningTime="2025-11-24 09:16:25.631016882 +0000 UTC m=+762.889994330" watchObservedRunningTime="2025-11-24 09:16:25.657213935 +0000 UTC m=+762.916191381" Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.657537 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5d86b44686-4x76m" podStartSLOduration=4.140257978 podStartE2EDuration="12.657532355s" podCreationTimestamp="2025-11-24 09:16:13 +0000 UTC" firstStartedPulling="2025-11-24 09:16:15.372487939 +0000 UTC m=+752.631465387" lastFinishedPulling="2025-11-24 09:16:23.889762317 +0000 UTC m=+761.148739764" observedRunningTime="2025-11-24 09:16:25.652055557 +0000 UTC m=+762.911033004" watchObservedRunningTime="2025-11-24 09:16:25.657532355 +0000 UTC m=+762.916509803" Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.674117 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-56dfb6b67f-77wgb" podStartSLOduration=4.551229923 podStartE2EDuration="12.674089899s" podCreationTimestamp="2025-11-24 09:16:13 +0000 UTC" firstStartedPulling="2025-11-24 09:16:15.797147376 +0000 UTC m=+753.056124823" lastFinishedPulling="2025-11-24 09:16:23.920007352 +0000 UTC m=+761.178984799" observedRunningTime="2025-11-24 09:16:25.669486518 +0000 UTC m=+762.928463964" watchObservedRunningTime="2025-11-24 09:16:25.674089899 +0000 UTC m=+762.933067346" Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 
09:16:25.690679 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-bf4c6585d-tnxst" podStartSLOduration=4.153176729 podStartE2EDuration="12.690650338s" podCreationTimestamp="2025-11-24 09:16:13 +0000 UTC" firstStartedPulling="2025-11-24 09:16:15.364754706 +0000 UTC m=+752.623732153" lastFinishedPulling="2025-11-24 09:16:23.902228315 +0000 UTC m=+761.161205762" observedRunningTime="2025-11-24 09:16:25.68762855 +0000 UTC m=+762.946605997" watchObservedRunningTime="2025-11-24 09:16:25.690650338 +0000 UTC m=+762.949627785" Nov 24 09:16:25 crc kubenswrapper[4563]: I1124 09:16:25.713845 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-799cb6ffd6-wck8j" podStartSLOduration=3.597782872 podStartE2EDuration="11.713831533s" podCreationTimestamp="2025-11-24 09:16:14 +0000 UTC" firstStartedPulling="2025-11-24 09:16:15.772340085 +0000 UTC m=+753.031317532" lastFinishedPulling="2025-11-24 09:16:23.888388747 +0000 UTC m=+761.147366193" observedRunningTime="2025-11-24 09:16:25.708069678 +0000 UTC m=+762.967047125" watchObservedRunningTime="2025-11-24 09:16:25.713831533 +0000 UTC m=+762.972808980" Nov 24 09:16:26 crc kubenswrapper[4563]: I1124 09:16:26.454902 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7879fb76fd-4tv9l" event={"ID":"ebed0d67-0bac-4d1f-a2d0-2e367d78d157","Type":"ContainerStarted","Data":"e9c5349b233ed47aea515d2b8562e000fa5ccf4798fb5f97a85c8ded214ddb7a"} Nov 24 09:16:32 crc kubenswrapper[4563]: I1124 09:16:32.502629 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8464cf66df-chpfj" event={"ID":"238f517b-0e10-411c-8b3c-c6bdbe261159","Type":"ContainerStarted","Data":"ec9224b4e3e61a1bdca77c0a601240d423728efe6d1bad360f184cf61acbfef8"} Nov 24 09:16:32 crc kubenswrapper[4563]: 
I1124 09:16:32.503119 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-8464cf66df-chpfj" Nov 24 09:16:32 crc kubenswrapper[4563]: I1124 09:16:32.517934 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-8464cf66df-chpfj" podStartSLOduration=2.9318126639999997 podStartE2EDuration="18.517915107s" podCreationTimestamp="2025-11-24 09:16:14 +0000 UTC" firstStartedPulling="2025-11-24 09:16:15.827955624 +0000 UTC m=+753.086933072" lastFinishedPulling="2025-11-24 09:16:31.414058067 +0000 UTC m=+768.673035515" observedRunningTime="2025-11-24 09:16:32.516213658 +0000 UTC m=+769.775191106" watchObservedRunningTime="2025-11-24 09:16:32.517915107 +0000 UTC m=+769.776892554" Nov 24 09:16:34 crc kubenswrapper[4563]: I1124 09:16:34.187621 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7768f8c84f-glf4s" Nov 24 09:16:34 crc kubenswrapper[4563]: I1124 09:16:34.254209 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6d8fd67bf7-jnx9f" Nov 24 09:16:34 crc kubenswrapper[4563]: I1124 09:16:34.289067 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-8667fbf6f6-k9wzp" Nov 24 09:16:34 crc kubenswrapper[4563]: I1124 09:16:34.297144 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-56dfb6b67f-77wgb" Nov 24 09:16:34 crc kubenswrapper[4563]: I1124 09:16:34.309868 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-bf4c6585d-tnxst" Nov 24 09:16:34 crc kubenswrapper[4563]: I1124 09:16:34.332364 4563 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5d86b44686-4x76m" Nov 24 09:16:34 crc kubenswrapper[4563]: I1124 09:16:34.351576 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-769d9c7585-4f5hq" Nov 24 09:16:34 crc kubenswrapper[4563]: I1124 09:16:34.366149 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5c75d7c94b-ltqbl" Nov 24 09:16:34 crc kubenswrapper[4563]: I1124 09:16:34.476807 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7bb88cb858-44jfn" Nov 24 09:16:34 crc kubenswrapper[4563]: I1124 09:16:34.520940 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-79d88dcd444qmtr" event={"ID":"974a1619-7c48-46d6-b639-5f965c6b747a","Type":"ContainerStarted","Data":"505d196813c05d7647934eef40253c835c4ceeb24d41505ef07e144e6cd0b4bf"} Nov 24 09:16:34 crc kubenswrapper[4563]: I1124 09:16:34.521196 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-79d88dcd444qmtr" Nov 24 09:16:34 crc kubenswrapper[4563]: I1124 09:16:34.524736 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5bdf4f7f7f-6n5jh" event={"ID":"9fb1ddc7-1195-412e-93ed-4799bc756bae","Type":"ContainerStarted","Data":"86d26b976971b0e57030fd368e76a26cb2ec236ffb4d44e0efa43c717ccf168a"} Nov 24 09:16:34 crc kubenswrapper[4563]: I1124 09:16:34.525190 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-5bdf4f7f7f-6n5jh" Nov 24 09:16:34 crc kubenswrapper[4563]: I1124 09:16:34.537299 4563 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7879fb76fd-4tv9l" Nov 24 09:16:34 crc kubenswrapper[4563]: I1124 09:16:34.545752 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-79d88dcd444qmtr" podStartSLOduration=2.786699742 podStartE2EDuration="20.545734513s" podCreationTimestamp="2025-11-24 09:16:14 +0000 UTC" firstStartedPulling="2025-11-24 09:16:15.843472797 +0000 UTC m=+753.102450244" lastFinishedPulling="2025-11-24 09:16:33.602507567 +0000 UTC m=+770.861485015" observedRunningTime="2025-11-24 09:16:34.538350929 +0000 UTC m=+771.797328386" watchObservedRunningTime="2025-11-24 09:16:34.545734513 +0000 UTC m=+771.804711960" Nov 24 09:16:34 crc kubenswrapper[4563]: I1124 09:16:34.556858 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-5bdf4f7f7f-6n5jh" podStartSLOduration=2.7791433420000002 podStartE2EDuration="20.556848021s" podCreationTimestamp="2025-11-24 09:16:14 +0000 UTC" firstStartedPulling="2025-11-24 09:16:15.816747208 +0000 UTC m=+753.075724655" lastFinishedPulling="2025-11-24 09:16:33.594451886 +0000 UTC m=+770.853429334" observedRunningTime="2025-11-24 09:16:34.547538084 +0000 UTC m=+771.806515531" watchObservedRunningTime="2025-11-24 09:16:34.556848021 +0000 UTC m=+771.815825468" Nov 24 09:16:34 crc kubenswrapper[4563]: I1124 09:16:34.613058 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-86d796d84d-vkltr" Nov 24 09:16:34 crc kubenswrapper[4563]: I1124 09:16:34.662011 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6fdc856c5d-h78s9" Nov 24 09:16:34 crc kubenswrapper[4563]: I1124 09:16:34.720619 4563 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-6dc664666c-6flr8" Nov 24 09:16:34 crc kubenswrapper[4563]: I1124 09:16:34.837058 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-799cb6ffd6-wck8j" Nov 24 09:16:38 crc kubenswrapper[4563]: I1124 09:16:38.987553 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:16:38 crc kubenswrapper[4563]: I1124 09:16:38.988350 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:16:42 crc kubenswrapper[4563]: I1124 09:16:42.583059 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-66b7d6f598-fffcm" event={"ID":"ffcb9e74-1697-402a-b77b-5a3ecc832759","Type":"ContainerStarted","Data":"3bd7c246eacf3a7a2e245435aa1bd07d63f84c8455b22a88452560dd1164924b"} Nov 24 09:16:42 crc kubenswrapper[4563]: I1124 09:16:42.583588 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-66b7d6f598-fffcm" Nov 24 09:16:42 crc kubenswrapper[4563]: I1124 09:16:42.585622 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6f8c5b86cb-94tjk" event={"ID":"a30aea9a-f4c8-42a3-89bb-af9ffef55544","Type":"ContainerStarted","Data":"efdb7e667e7cd523dfc615cd6eac5848488f70e94195992415b9be50d2a5929b"} Nov 24 09:16:42 
crc kubenswrapper[4563]: I1124 09:16:42.585865 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6f8c5b86cb-94tjk" Nov 24 09:16:42 crc kubenswrapper[4563]: I1124 09:16:42.587397 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7798859c74-z5b6f" event={"ID":"9d2aa6d4-db94-44dd-99f1-6e95e8de9a5f","Type":"ContainerStarted","Data":"1dff5b3266ec87d8c3b1725c47f71d96c68b11ca47a601c96880918092a7518c"} Nov 24 09:16:42 crc kubenswrapper[4563]: I1124 09:16:42.587613 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7798859c74-z5b6f" Nov 24 09:16:42 crc kubenswrapper[4563]: I1124 09:16:42.589476 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7cd4fb6f79-qhzw4" event={"ID":"d5e12170-5cc0-4f8f-89d7-c64f38f2226e","Type":"ContainerStarted","Data":"795646d2a74bb82f8a1a112bd0301e0e6ca1b721b33be2842d12fd0d1a2a2bd0"} Nov 24 09:16:42 crc kubenswrapper[4563]: I1124 09:16:42.589712 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-7cd4fb6f79-qhzw4" Nov 24 09:16:42 crc kubenswrapper[4563]: I1124 09:16:42.590994 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-g64g6" event={"ID":"00f5e4f8-193c-48df-b29f-8f359f263a5a","Type":"ContainerStarted","Data":"064e76dcef6ebdaf3b6a916a272eca5d936ab819a4237565c27291034ed5b870"} Nov 24 09:16:42 crc kubenswrapper[4563]: I1124 09:16:42.614118 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-66b7d6f598-fffcm" podStartSLOduration=2.353709024 podStartE2EDuration="28.614102559s" podCreationTimestamp="2025-11-24 09:16:14 +0000 
UTC" firstStartedPulling="2025-11-24 09:16:15.823960681 +0000 UTC m=+753.082938128" lastFinishedPulling="2025-11-24 09:16:42.084354216 +0000 UTC m=+779.343331663" observedRunningTime="2025-11-24 09:16:42.609548711 +0000 UTC m=+779.868526158" watchObservedRunningTime="2025-11-24 09:16:42.614102559 +0000 UTC m=+779.873080006" Nov 24 09:16:42 crc kubenswrapper[4563]: I1124 09:16:42.625958 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-g64g6" podStartSLOduration=2.396006005 podStartE2EDuration="28.625934691s" podCreationTimestamp="2025-11-24 09:16:14 +0000 UTC" firstStartedPulling="2025-11-24 09:16:15.885069779 +0000 UTC m=+753.144047226" lastFinishedPulling="2025-11-24 09:16:42.114998465 +0000 UTC m=+779.373975912" observedRunningTime="2025-11-24 09:16:42.62228037 +0000 UTC m=+779.881257817" watchObservedRunningTime="2025-11-24 09:16:42.625934691 +0000 UTC m=+779.884912128" Nov 24 09:16:42 crc kubenswrapper[4563]: I1124 09:16:42.640330 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7798859c74-z5b6f" podStartSLOduration=2.3660625189999998 podStartE2EDuration="28.640318426s" podCreationTimestamp="2025-11-24 09:16:14 +0000 UTC" firstStartedPulling="2025-11-24 09:16:15.81855174 +0000 UTC m=+753.077529187" lastFinishedPulling="2025-11-24 09:16:42.092807648 +0000 UTC m=+779.351785094" observedRunningTime="2025-11-24 09:16:42.637468702 +0000 UTC m=+779.896446149" watchObservedRunningTime="2025-11-24 09:16:42.640318426 +0000 UTC m=+779.899295873" Nov 24 09:16:42 crc kubenswrapper[4563]: I1124 09:16:42.652091 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-7cd4fb6f79-qhzw4" podStartSLOduration=2.406775976 podStartE2EDuration="28.652081198s" podCreationTimestamp="2025-11-24 09:16:14 +0000 UTC" 
firstStartedPulling="2025-11-24 09:16:15.869678645 +0000 UTC m=+753.128656091" lastFinishedPulling="2025-11-24 09:16:42.114983867 +0000 UTC m=+779.373961313" observedRunningTime="2025-11-24 09:16:42.648951686 +0000 UTC m=+779.907929134" watchObservedRunningTime="2025-11-24 09:16:42.652081198 +0000 UTC m=+779.911058645" Nov 24 09:16:42 crc kubenswrapper[4563]: I1124 09:16:42.669424 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6f8c5b86cb-94tjk" podStartSLOduration=2.469300755 podStartE2EDuration="28.669413784s" podCreationTimestamp="2025-11-24 09:16:14 +0000 UTC" firstStartedPulling="2025-11-24 09:16:15.883995053 +0000 UTC m=+753.142972500" lastFinishedPulling="2025-11-24 09:16:42.084108082 +0000 UTC m=+779.343085529" observedRunningTime="2025-11-24 09:16:42.667146358 +0000 UTC m=+779.926123805" watchObservedRunningTime="2025-11-24 09:16:42.669413784 +0000 UTC m=+779.928391231" Nov 24 09:16:44 crc kubenswrapper[4563]: I1124 09:16:44.802077 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-5bdf4f7f7f-6n5jh" Nov 24 09:16:44 crc kubenswrapper[4563]: I1124 09:16:44.947301 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-8464cf66df-chpfj" Nov 24 09:16:45 crc kubenswrapper[4563]: I1124 09:16:45.065529 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-79d88dcd444qmtr" Nov 24 09:16:54 crc kubenswrapper[4563]: I1124 09:16:54.556021 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6f8c5b86cb-94tjk" Nov 24 09:16:54 crc kubenswrapper[4563]: I1124 09:16:54.589655 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/neutron-operator-controller-manager-66b7d6f598-fffcm" Nov 24 09:16:54 crc kubenswrapper[4563]: I1124 09:16:54.902707 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7798859c74-z5b6f" Nov 24 09:16:54 crc kubenswrapper[4563]: I1124 09:16:54.970819 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-7cd4fb6f79-qhzw4" Nov 24 09:17:01 crc kubenswrapper[4563]: I1124 09:17:01.628998 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qzvm8"] Nov 24 09:17:01 crc kubenswrapper[4563]: I1124 09:17:01.631304 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qzvm8" Nov 24 09:17:01 crc kubenswrapper[4563]: I1124 09:17:01.633415 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qzvm8"] Nov 24 09:17:01 crc kubenswrapper[4563]: I1124 09:17:01.767659 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/757c88fd-7bed-4810-a374-f1246f058983-utilities\") pod \"certified-operators-qzvm8\" (UID: \"757c88fd-7bed-4810-a374-f1246f058983\") " pod="openshift-marketplace/certified-operators-qzvm8" Nov 24 09:17:01 crc kubenswrapper[4563]: I1124 09:17:01.767797 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrhb9\" (UniqueName: \"kubernetes.io/projected/757c88fd-7bed-4810-a374-f1246f058983-kube-api-access-mrhb9\") pod \"certified-operators-qzvm8\" (UID: \"757c88fd-7bed-4810-a374-f1246f058983\") " pod="openshift-marketplace/certified-operators-qzvm8" Nov 24 09:17:01 crc kubenswrapper[4563]: I1124 09:17:01.767863 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/757c88fd-7bed-4810-a374-f1246f058983-catalog-content\") pod \"certified-operators-qzvm8\" (UID: \"757c88fd-7bed-4810-a374-f1246f058983\") " pod="openshift-marketplace/certified-operators-qzvm8" Nov 24 09:17:01 crc kubenswrapper[4563]: I1124 09:17:01.869367 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/757c88fd-7bed-4810-a374-f1246f058983-catalog-content\") pod \"certified-operators-qzvm8\" (UID: \"757c88fd-7bed-4810-a374-f1246f058983\") " pod="openshift-marketplace/certified-operators-qzvm8" Nov 24 09:17:01 crc kubenswrapper[4563]: I1124 09:17:01.869495 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/757c88fd-7bed-4810-a374-f1246f058983-utilities\") pod \"certified-operators-qzvm8\" (UID: \"757c88fd-7bed-4810-a374-f1246f058983\") " pod="openshift-marketplace/certified-operators-qzvm8" Nov 24 09:17:01 crc kubenswrapper[4563]: I1124 09:17:01.869590 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrhb9\" (UniqueName: \"kubernetes.io/projected/757c88fd-7bed-4810-a374-f1246f058983-kube-api-access-mrhb9\") pod \"certified-operators-qzvm8\" (UID: \"757c88fd-7bed-4810-a374-f1246f058983\") " pod="openshift-marketplace/certified-operators-qzvm8" Nov 24 09:17:01 crc kubenswrapper[4563]: I1124 09:17:01.869909 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/757c88fd-7bed-4810-a374-f1246f058983-catalog-content\") pod \"certified-operators-qzvm8\" (UID: \"757c88fd-7bed-4810-a374-f1246f058983\") " pod="openshift-marketplace/certified-operators-qzvm8" Nov 24 09:17:01 crc kubenswrapper[4563]: I1124 09:17:01.869984 4563 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/757c88fd-7bed-4810-a374-f1246f058983-utilities\") pod \"certified-operators-qzvm8\" (UID: \"757c88fd-7bed-4810-a374-f1246f058983\") " pod="openshift-marketplace/certified-operators-qzvm8" Nov 24 09:17:01 crc kubenswrapper[4563]: I1124 09:17:01.887279 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrhb9\" (UniqueName: \"kubernetes.io/projected/757c88fd-7bed-4810-a374-f1246f058983-kube-api-access-mrhb9\") pod \"certified-operators-qzvm8\" (UID: \"757c88fd-7bed-4810-a374-f1246f058983\") " pod="openshift-marketplace/certified-operators-qzvm8" Nov 24 09:17:01 crc kubenswrapper[4563]: I1124 09:17:01.946747 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qzvm8" Nov 24 09:17:02 crc kubenswrapper[4563]: I1124 09:17:02.385429 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qzvm8"] Nov 24 09:17:02 crc kubenswrapper[4563]: W1124 09:17:02.389174 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod757c88fd_7bed_4810_a374_f1246f058983.slice/crio-9a845ec814ba7722458dc6ee492d6653eac2ff8f928330a0639e2343b5c93cc2 WatchSource:0}: Error finding container 9a845ec814ba7722458dc6ee492d6653eac2ff8f928330a0639e2343b5c93cc2: Status 404 returned error can't find the container with id 9a845ec814ba7722458dc6ee492d6653eac2ff8f928330a0639e2343b5c93cc2 Nov 24 09:17:02 crc kubenswrapper[4563]: I1124 09:17:02.732212 4563 generic.go:334] "Generic (PLEG): container finished" podID="757c88fd-7bed-4810-a374-f1246f058983" containerID="c40849eb4064ac80f1ff3d2435bda157518d20eb9b3bd89c5d1f4b61160d3bbf" exitCode=0 Nov 24 09:17:02 crc kubenswrapper[4563]: I1124 09:17:02.732341 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-qzvm8" event={"ID":"757c88fd-7bed-4810-a374-f1246f058983","Type":"ContainerDied","Data":"c40849eb4064ac80f1ff3d2435bda157518d20eb9b3bd89c5d1f4b61160d3bbf"} Nov 24 09:17:02 crc kubenswrapper[4563]: I1124 09:17:02.732601 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzvm8" event={"ID":"757c88fd-7bed-4810-a374-f1246f058983","Type":"ContainerStarted","Data":"9a845ec814ba7722458dc6ee492d6653eac2ff8f928330a0639e2343b5c93cc2"} Nov 24 09:17:03 crc kubenswrapper[4563]: I1124 09:17:03.742929 4563 generic.go:334] "Generic (PLEG): container finished" podID="757c88fd-7bed-4810-a374-f1246f058983" containerID="969b141569ae1ef134845d5f83f9ec41eb9f9d47c516fa38583831c3b76c5c91" exitCode=0 Nov 24 09:17:03 crc kubenswrapper[4563]: I1124 09:17:03.743023 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzvm8" event={"ID":"757c88fd-7bed-4810-a374-f1246f058983","Type":"ContainerDied","Data":"969b141569ae1ef134845d5f83f9ec41eb9f9d47c516fa38583831c3b76c5c91"} Nov 24 09:17:04 crc kubenswrapper[4563]: I1124 09:17:04.752875 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzvm8" event={"ID":"757c88fd-7bed-4810-a374-f1246f058983","Type":"ContainerStarted","Data":"f60157b6cab8eef58fe8a370219450715e305cbd7672d8ea52fd3b703fafc7d9"} Nov 24 09:17:04 crc kubenswrapper[4563]: I1124 09:17:04.770092 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qzvm8" podStartSLOduration=2.1626574 podStartE2EDuration="3.770075663s" podCreationTimestamp="2025-11-24 09:17:01 +0000 UTC" firstStartedPulling="2025-11-24 09:17:02.734629847 +0000 UTC m=+799.993607295" lastFinishedPulling="2025-11-24 09:17:04.34204811 +0000 UTC m=+801.601025558" observedRunningTime="2025-11-24 09:17:04.767382443 +0000 UTC m=+802.026359890" 
watchObservedRunningTime="2025-11-24 09:17:04.770075663 +0000 UTC m=+802.029053109" Nov 24 09:17:06 crc kubenswrapper[4563]: I1124 09:17:06.615087 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-246hk"] Nov 24 09:17:06 crc kubenswrapper[4563]: I1124 09:17:06.616852 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-246hk" Nov 24 09:17:06 crc kubenswrapper[4563]: I1124 09:17:06.642965 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-246hk"] Nov 24 09:17:06 crc kubenswrapper[4563]: I1124 09:17:06.647367 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbs8t\" (UniqueName: \"kubernetes.io/projected/a09025e4-08f3-453a-8356-3ef3eba4b04d-kube-api-access-cbs8t\") pod \"community-operators-246hk\" (UID: \"a09025e4-08f3-453a-8356-3ef3eba4b04d\") " pod="openshift-marketplace/community-operators-246hk" Nov 24 09:17:06 crc kubenswrapper[4563]: I1124 09:17:06.647402 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a09025e4-08f3-453a-8356-3ef3eba4b04d-utilities\") pod \"community-operators-246hk\" (UID: \"a09025e4-08f3-453a-8356-3ef3eba4b04d\") " pod="openshift-marketplace/community-operators-246hk" Nov 24 09:17:06 crc kubenswrapper[4563]: I1124 09:17:06.647462 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a09025e4-08f3-453a-8356-3ef3eba4b04d-catalog-content\") pod \"community-operators-246hk\" (UID: \"a09025e4-08f3-453a-8356-3ef3eba4b04d\") " pod="openshift-marketplace/community-operators-246hk" Nov 24 09:17:06 crc kubenswrapper[4563]: I1124 09:17:06.748519 4563 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a09025e4-08f3-453a-8356-3ef3eba4b04d-utilities\") pod \"community-operators-246hk\" (UID: \"a09025e4-08f3-453a-8356-3ef3eba4b04d\") " pod="openshift-marketplace/community-operators-246hk" Nov 24 09:17:06 crc kubenswrapper[4563]: I1124 09:17:06.749019 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a09025e4-08f3-453a-8356-3ef3eba4b04d-utilities\") pod \"community-operators-246hk\" (UID: \"a09025e4-08f3-453a-8356-3ef3eba4b04d\") " pod="openshift-marketplace/community-operators-246hk" Nov 24 09:17:06 crc kubenswrapper[4563]: I1124 09:17:06.749030 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a09025e4-08f3-453a-8356-3ef3eba4b04d-catalog-content\") pod \"community-operators-246hk\" (UID: \"a09025e4-08f3-453a-8356-3ef3eba4b04d\") " pod="openshift-marketplace/community-operators-246hk" Nov 24 09:17:06 crc kubenswrapper[4563]: I1124 09:17:06.749302 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbs8t\" (UniqueName: \"kubernetes.io/projected/a09025e4-08f3-453a-8356-3ef3eba4b04d-kube-api-access-cbs8t\") pod \"community-operators-246hk\" (UID: \"a09025e4-08f3-453a-8356-3ef3eba4b04d\") " pod="openshift-marketplace/community-operators-246hk" Nov 24 09:17:06 crc kubenswrapper[4563]: I1124 09:17:06.749379 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a09025e4-08f3-453a-8356-3ef3eba4b04d-catalog-content\") pod \"community-operators-246hk\" (UID: \"a09025e4-08f3-453a-8356-3ef3eba4b04d\") " pod="openshift-marketplace/community-operators-246hk" Nov 24 09:17:06 crc kubenswrapper[4563]: I1124 09:17:06.774503 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbs8t\" 
(UniqueName: \"kubernetes.io/projected/a09025e4-08f3-453a-8356-3ef3eba4b04d-kube-api-access-cbs8t\") pod \"community-operators-246hk\" (UID: \"a09025e4-08f3-453a-8356-3ef3eba4b04d\") " pod="openshift-marketplace/community-operators-246hk" Nov 24 09:17:06 crc kubenswrapper[4563]: I1124 09:17:06.929918 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-246hk" Nov 24 09:17:07 crc kubenswrapper[4563]: I1124 09:17:07.396820 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-246hk"] Nov 24 09:17:07 crc kubenswrapper[4563]: I1124 09:17:07.777103 4563 generic.go:334] "Generic (PLEG): container finished" podID="a09025e4-08f3-453a-8356-3ef3eba4b04d" containerID="a75a87c5e3b2ebaafe2571c9294613809649a776345d09f67956e8a0c54018ad" exitCode=0 Nov 24 09:17:07 crc kubenswrapper[4563]: I1124 09:17:07.778536 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-246hk" event={"ID":"a09025e4-08f3-453a-8356-3ef3eba4b04d","Type":"ContainerDied","Data":"a75a87c5e3b2ebaafe2571c9294613809649a776345d09f67956e8a0c54018ad"} Nov 24 09:17:07 crc kubenswrapper[4563]: I1124 09:17:07.778653 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-246hk" event={"ID":"a09025e4-08f3-453a-8356-3ef3eba4b04d","Type":"ContainerStarted","Data":"ab01318b6cd33a72224a419f8239b42dab561f227c4553087a32e474a8eb3b89"} Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.329586 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bdd77c89-xdbpz"] Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.330677 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bdd77c89-xdbpz" Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.333022 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.333117 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.334221 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.340221 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bdd77c89-xdbpz"] Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.341692 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-jk4j7" Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.379608 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d574c25-9f39-4836-952d-f75bbfd7ae95-config\") pod \"dnsmasq-dns-7bdd77c89-xdbpz\" (UID: \"0d574c25-9f39-4836-952d-f75bbfd7ae95\") " pod="openstack/dnsmasq-dns-7bdd77c89-xdbpz" Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.379674 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gslwt\" (UniqueName: \"kubernetes.io/projected/0d574c25-9f39-4836-952d-f75bbfd7ae95-kube-api-access-gslwt\") pod \"dnsmasq-dns-7bdd77c89-xdbpz\" (UID: \"0d574c25-9f39-4836-952d-f75bbfd7ae95\") " pod="openstack/dnsmasq-dns-7bdd77c89-xdbpz" Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.387650 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6584b49599-r4gs7"] Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.392627 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6584b49599-r4gs7" Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.395179 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.412137 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6584b49599-r4gs7"] Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.481459 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0fc0729-ebad-4f5e-9599-c04cc45fdfe7-dns-svc\") pod \"dnsmasq-dns-6584b49599-r4gs7\" (UID: \"d0fc0729-ebad-4f5e-9599-c04cc45fdfe7\") " pod="openstack/dnsmasq-dns-6584b49599-r4gs7" Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.481821 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d574c25-9f39-4836-952d-f75bbfd7ae95-config\") pod \"dnsmasq-dns-7bdd77c89-xdbpz\" (UID: \"0d574c25-9f39-4836-952d-f75bbfd7ae95\") " pod="openstack/dnsmasq-dns-7bdd77c89-xdbpz" Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.481862 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gslwt\" (UniqueName: \"kubernetes.io/projected/0d574c25-9f39-4836-952d-f75bbfd7ae95-kube-api-access-gslwt\") pod \"dnsmasq-dns-7bdd77c89-xdbpz\" (UID: \"0d574c25-9f39-4836-952d-f75bbfd7ae95\") " pod="openstack/dnsmasq-dns-7bdd77c89-xdbpz" Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.482063 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0fc0729-ebad-4f5e-9599-c04cc45fdfe7-config\") pod \"dnsmasq-dns-6584b49599-r4gs7\" (UID: \"d0fc0729-ebad-4f5e-9599-c04cc45fdfe7\") " pod="openstack/dnsmasq-dns-6584b49599-r4gs7" Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 
09:17:08.482093 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvbr6\" (UniqueName: \"kubernetes.io/projected/d0fc0729-ebad-4f5e-9599-c04cc45fdfe7-kube-api-access-hvbr6\") pod \"dnsmasq-dns-6584b49599-r4gs7\" (UID: \"d0fc0729-ebad-4f5e-9599-c04cc45fdfe7\") " pod="openstack/dnsmasq-dns-6584b49599-r4gs7" Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.482652 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d574c25-9f39-4836-952d-f75bbfd7ae95-config\") pod \"dnsmasq-dns-7bdd77c89-xdbpz\" (UID: \"0d574c25-9f39-4836-952d-f75bbfd7ae95\") " pod="openstack/dnsmasq-dns-7bdd77c89-xdbpz" Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.500566 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gslwt\" (UniqueName: \"kubernetes.io/projected/0d574c25-9f39-4836-952d-f75bbfd7ae95-kube-api-access-gslwt\") pod \"dnsmasq-dns-7bdd77c89-xdbpz\" (UID: \"0d574c25-9f39-4836-952d-f75bbfd7ae95\") " pod="openstack/dnsmasq-dns-7bdd77c89-xdbpz" Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.582864 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0fc0729-ebad-4f5e-9599-c04cc45fdfe7-config\") pod \"dnsmasq-dns-6584b49599-r4gs7\" (UID: \"d0fc0729-ebad-4f5e-9599-c04cc45fdfe7\") " pod="openstack/dnsmasq-dns-6584b49599-r4gs7" Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.582909 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvbr6\" (UniqueName: \"kubernetes.io/projected/d0fc0729-ebad-4f5e-9599-c04cc45fdfe7-kube-api-access-hvbr6\") pod \"dnsmasq-dns-6584b49599-r4gs7\" (UID: \"d0fc0729-ebad-4f5e-9599-c04cc45fdfe7\") " pod="openstack/dnsmasq-dns-6584b49599-r4gs7" Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.582948 4563 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0fc0729-ebad-4f5e-9599-c04cc45fdfe7-dns-svc\") pod \"dnsmasq-dns-6584b49599-r4gs7\" (UID: \"d0fc0729-ebad-4f5e-9599-c04cc45fdfe7\") " pod="openstack/dnsmasq-dns-6584b49599-r4gs7" Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.584106 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0fc0729-ebad-4f5e-9599-c04cc45fdfe7-dns-svc\") pod \"dnsmasq-dns-6584b49599-r4gs7\" (UID: \"d0fc0729-ebad-4f5e-9599-c04cc45fdfe7\") " pod="openstack/dnsmasq-dns-6584b49599-r4gs7" Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.584179 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0fc0729-ebad-4f5e-9599-c04cc45fdfe7-config\") pod \"dnsmasq-dns-6584b49599-r4gs7\" (UID: \"d0fc0729-ebad-4f5e-9599-c04cc45fdfe7\") " pod="openstack/dnsmasq-dns-6584b49599-r4gs7" Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.599935 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvbr6\" (UniqueName: \"kubernetes.io/projected/d0fc0729-ebad-4f5e-9599-c04cc45fdfe7-kube-api-access-hvbr6\") pod \"dnsmasq-dns-6584b49599-r4gs7\" (UID: \"d0fc0729-ebad-4f5e-9599-c04cc45fdfe7\") " pod="openstack/dnsmasq-dns-6584b49599-r4gs7" Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.645364 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bdd77c89-xdbpz" Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.718891 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6584b49599-r4gs7" Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.796612 4563 generic.go:334] "Generic (PLEG): container finished" podID="a09025e4-08f3-453a-8356-3ef3eba4b04d" containerID="01a75bb2102d0fd5f39344426da692a6fff5a6f24cd136e5ac7fbd50c90b8e55" exitCode=0 Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.796676 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-246hk" event={"ID":"a09025e4-08f3-453a-8356-3ef3eba4b04d","Type":"ContainerDied","Data":"01a75bb2102d0fd5f39344426da692a6fff5a6f24cd136e5ac7fbd50c90b8e55"} Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.987852 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:17:08 crc kubenswrapper[4563]: I1124 09:17:08.987910 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:17:09 crc kubenswrapper[4563]: I1124 09:17:09.026936 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bdd77c89-xdbpz"] Nov 24 09:17:09 crc kubenswrapper[4563]: W1124 09:17:09.103198 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0fc0729_ebad_4f5e_9599_c04cc45fdfe7.slice/crio-b6bbcec5834f24e0d86dadff5fe98f560cd98e603038c0624eee9ee3edccc8b7 WatchSource:0}: Error finding container b6bbcec5834f24e0d86dadff5fe98f560cd98e603038c0624eee9ee3edccc8b7: Status 404 returned error 
can't find the container with id b6bbcec5834f24e0d86dadff5fe98f560cd98e603038c0624eee9ee3edccc8b7 Nov 24 09:17:09 crc kubenswrapper[4563]: I1124 09:17:09.104421 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6584b49599-r4gs7"] Nov 24 09:17:09 crc kubenswrapper[4563]: I1124 09:17:09.806024 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6584b49599-r4gs7" event={"ID":"d0fc0729-ebad-4f5e-9599-c04cc45fdfe7","Type":"ContainerStarted","Data":"b6bbcec5834f24e0d86dadff5fe98f560cd98e603038c0624eee9ee3edccc8b7"} Nov 24 09:17:09 crc kubenswrapper[4563]: I1124 09:17:09.807066 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bdd77c89-xdbpz" event={"ID":"0d574c25-9f39-4836-952d-f75bbfd7ae95","Type":"ContainerStarted","Data":"6c768c894aaee7d6c48407b463604901fe789f1cf321f5a3b5bfa89369de01eb"} Nov 24 09:17:10 crc kubenswrapper[4563]: I1124 09:17:10.826658 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-246hk" event={"ID":"a09025e4-08f3-453a-8356-3ef3eba4b04d","Type":"ContainerStarted","Data":"a852efc0c1439e27e50602b5b6c9f248a056dbf0e600d7d04554e46e04beeed3"} Nov 24 09:17:10 crc kubenswrapper[4563]: I1124 09:17:10.848812 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-246hk" podStartSLOduration=2.090543288 podStartE2EDuration="4.848798531s" podCreationTimestamp="2025-11-24 09:17:06 +0000 UTC" firstStartedPulling="2025-11-24 09:17:07.780242498 +0000 UTC m=+805.039219945" lastFinishedPulling="2025-11-24 09:17:10.538497742 +0000 UTC m=+807.797475188" observedRunningTime="2025-11-24 09:17:10.845293419 +0000 UTC m=+808.104270866" watchObservedRunningTime="2025-11-24 09:17:10.848798531 +0000 UTC m=+808.107775977" Nov 24 09:17:11 crc kubenswrapper[4563]: I1124 09:17:11.546283 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-6584b49599-r4gs7"] Nov 24 09:17:11 crc kubenswrapper[4563]: I1124 09:17:11.573404 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c6d9948dc-d7bd7"] Nov 24 09:17:11 crc kubenswrapper[4563]: I1124 09:17:11.575010 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c6d9948dc-d7bd7" Nov 24 09:17:11 crc kubenswrapper[4563]: I1124 09:17:11.580178 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c6d9948dc-d7bd7"] Nov 24 09:17:11 crc kubenswrapper[4563]: I1124 09:17:11.629179 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a278b53-bccd-4b13-aeaf-14674dacdb41-dns-svc\") pod \"dnsmasq-dns-7c6d9948dc-d7bd7\" (UID: \"2a278b53-bccd-4b13-aeaf-14674dacdb41\") " pod="openstack/dnsmasq-dns-7c6d9948dc-d7bd7" Nov 24 09:17:11 crc kubenswrapper[4563]: I1124 09:17:11.629275 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzjdx\" (UniqueName: \"kubernetes.io/projected/2a278b53-bccd-4b13-aeaf-14674dacdb41-kube-api-access-xzjdx\") pod \"dnsmasq-dns-7c6d9948dc-d7bd7\" (UID: \"2a278b53-bccd-4b13-aeaf-14674dacdb41\") " pod="openstack/dnsmasq-dns-7c6d9948dc-d7bd7" Nov 24 09:17:11 crc kubenswrapper[4563]: I1124 09:17:11.629364 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a278b53-bccd-4b13-aeaf-14674dacdb41-config\") pod \"dnsmasq-dns-7c6d9948dc-d7bd7\" (UID: \"2a278b53-bccd-4b13-aeaf-14674dacdb41\") " pod="openstack/dnsmasq-dns-7c6d9948dc-d7bd7" Nov 24 09:17:11 crc kubenswrapper[4563]: I1124 09:17:11.730912 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzjdx\" (UniqueName: 
\"kubernetes.io/projected/2a278b53-bccd-4b13-aeaf-14674dacdb41-kube-api-access-xzjdx\") pod \"dnsmasq-dns-7c6d9948dc-d7bd7\" (UID: \"2a278b53-bccd-4b13-aeaf-14674dacdb41\") " pod="openstack/dnsmasq-dns-7c6d9948dc-d7bd7" Nov 24 09:17:11 crc kubenswrapper[4563]: I1124 09:17:11.731378 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a278b53-bccd-4b13-aeaf-14674dacdb41-config\") pod \"dnsmasq-dns-7c6d9948dc-d7bd7\" (UID: \"2a278b53-bccd-4b13-aeaf-14674dacdb41\") " pod="openstack/dnsmasq-dns-7c6d9948dc-d7bd7" Nov 24 09:17:11 crc kubenswrapper[4563]: I1124 09:17:11.732234 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a278b53-bccd-4b13-aeaf-14674dacdb41-config\") pod \"dnsmasq-dns-7c6d9948dc-d7bd7\" (UID: \"2a278b53-bccd-4b13-aeaf-14674dacdb41\") " pod="openstack/dnsmasq-dns-7c6d9948dc-d7bd7" Nov 24 09:17:11 crc kubenswrapper[4563]: I1124 09:17:11.732383 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a278b53-bccd-4b13-aeaf-14674dacdb41-dns-svc\") pod \"dnsmasq-dns-7c6d9948dc-d7bd7\" (UID: \"2a278b53-bccd-4b13-aeaf-14674dacdb41\") " pod="openstack/dnsmasq-dns-7c6d9948dc-d7bd7" Nov 24 09:17:11 crc kubenswrapper[4563]: I1124 09:17:11.732951 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a278b53-bccd-4b13-aeaf-14674dacdb41-dns-svc\") pod \"dnsmasq-dns-7c6d9948dc-d7bd7\" (UID: \"2a278b53-bccd-4b13-aeaf-14674dacdb41\") " pod="openstack/dnsmasq-dns-7c6d9948dc-d7bd7" Nov 24 09:17:11 crc kubenswrapper[4563]: I1124 09:17:11.783496 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzjdx\" (UniqueName: \"kubernetes.io/projected/2a278b53-bccd-4b13-aeaf-14674dacdb41-kube-api-access-xzjdx\") pod \"dnsmasq-dns-7c6d9948dc-d7bd7\" 
(UID: \"2a278b53-bccd-4b13-aeaf-14674dacdb41\") " pod="openstack/dnsmasq-dns-7c6d9948dc-d7bd7" Nov 24 09:17:11 crc kubenswrapper[4563]: I1124 09:17:11.804703 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bdd77c89-xdbpz"] Nov 24 09:17:11 crc kubenswrapper[4563]: I1124 09:17:11.825304 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6486446b9f-zhm5r"] Nov 24 09:17:11 crc kubenswrapper[4563]: I1124 09:17:11.826752 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6486446b9f-zhm5r" Nov 24 09:17:11 crc kubenswrapper[4563]: I1124 09:17:11.834067 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf409104-ddfd-4643-a35c-3c34c6ce2d14-dns-svc\") pod \"dnsmasq-dns-6486446b9f-zhm5r\" (UID: \"cf409104-ddfd-4643-a35c-3c34c6ce2d14\") " pod="openstack/dnsmasq-dns-6486446b9f-zhm5r" Nov 24 09:17:11 crc kubenswrapper[4563]: I1124 09:17:11.834148 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf409104-ddfd-4643-a35c-3c34c6ce2d14-config\") pod \"dnsmasq-dns-6486446b9f-zhm5r\" (UID: \"cf409104-ddfd-4643-a35c-3c34c6ce2d14\") " pod="openstack/dnsmasq-dns-6486446b9f-zhm5r" Nov 24 09:17:11 crc kubenswrapper[4563]: I1124 09:17:11.834176 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbl4n\" (UniqueName: \"kubernetes.io/projected/cf409104-ddfd-4643-a35c-3c34c6ce2d14-kube-api-access-cbl4n\") pod \"dnsmasq-dns-6486446b9f-zhm5r\" (UID: \"cf409104-ddfd-4643-a35c-3c34c6ce2d14\") " pod="openstack/dnsmasq-dns-6486446b9f-zhm5r" Nov 24 09:17:11 crc kubenswrapper[4563]: I1124 09:17:11.840656 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6486446b9f-zhm5r"] Nov 24 09:17:11 crc 
kubenswrapper[4563]: I1124 09:17:11.894458 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c6d9948dc-d7bd7" Nov 24 09:17:11 crc kubenswrapper[4563]: I1124 09:17:11.935479 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf409104-ddfd-4643-a35c-3c34c6ce2d14-config\") pod \"dnsmasq-dns-6486446b9f-zhm5r\" (UID: \"cf409104-ddfd-4643-a35c-3c34c6ce2d14\") " pod="openstack/dnsmasq-dns-6486446b9f-zhm5r" Nov 24 09:17:11 crc kubenswrapper[4563]: I1124 09:17:11.935540 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbl4n\" (UniqueName: \"kubernetes.io/projected/cf409104-ddfd-4643-a35c-3c34c6ce2d14-kube-api-access-cbl4n\") pod \"dnsmasq-dns-6486446b9f-zhm5r\" (UID: \"cf409104-ddfd-4643-a35c-3c34c6ce2d14\") " pod="openstack/dnsmasq-dns-6486446b9f-zhm5r" Nov 24 09:17:11 crc kubenswrapper[4563]: I1124 09:17:11.935777 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf409104-ddfd-4643-a35c-3c34c6ce2d14-dns-svc\") pod \"dnsmasq-dns-6486446b9f-zhm5r\" (UID: \"cf409104-ddfd-4643-a35c-3c34c6ce2d14\") " pod="openstack/dnsmasq-dns-6486446b9f-zhm5r" Nov 24 09:17:11 crc kubenswrapper[4563]: I1124 09:17:11.936685 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf409104-ddfd-4643-a35c-3c34c6ce2d14-dns-svc\") pod \"dnsmasq-dns-6486446b9f-zhm5r\" (UID: \"cf409104-ddfd-4643-a35c-3c34c6ce2d14\") " pod="openstack/dnsmasq-dns-6486446b9f-zhm5r" Nov 24 09:17:11 crc kubenswrapper[4563]: I1124 09:17:11.936957 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf409104-ddfd-4643-a35c-3c34c6ce2d14-config\") pod \"dnsmasq-dns-6486446b9f-zhm5r\" (UID: \"cf409104-ddfd-4643-a35c-3c34c6ce2d14\") 
" pod="openstack/dnsmasq-dns-6486446b9f-zhm5r" Nov 24 09:17:11 crc kubenswrapper[4563]: I1124 09:17:11.947783 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qzvm8" Nov 24 09:17:11 crc kubenswrapper[4563]: I1124 09:17:11.948813 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qzvm8" Nov 24 09:17:11 crc kubenswrapper[4563]: I1124 09:17:11.958659 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbl4n\" (UniqueName: \"kubernetes.io/projected/cf409104-ddfd-4643-a35c-3c34c6ce2d14-kube-api-access-cbl4n\") pod \"dnsmasq-dns-6486446b9f-zhm5r\" (UID: \"cf409104-ddfd-4643-a35c-3c34c6ce2d14\") " pod="openstack/dnsmasq-dns-6486446b9f-zhm5r" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.010853 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qzvm8" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.156165 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6486446b9f-zhm5r" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.391436 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c6d9948dc-d7bd7"] Nov 24 09:17:12 crc kubenswrapper[4563]: W1124 09:17:12.405956 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a278b53_bccd_4b13_aeaf_14674dacdb41.slice/crio-1e49619eb37ef4b2e140fad8fe4e5c1ba67378fc744dfd898362836910d05fdf WatchSource:0}: Error finding container 1e49619eb37ef4b2e140fad8fe4e5c1ba67378fc744dfd898362836910d05fdf: Status 404 returned error can't find the container with id 1e49619eb37ef4b2e140fad8fe4e5c1ba67378fc744dfd898362836910d05fdf Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.588021 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6486446b9f-zhm5r"] Nov 24 09:17:12 crc kubenswrapper[4563]: W1124 09:17:12.598492 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf409104_ddfd_4643_a35c_3c34c6ce2d14.slice/crio-41adc451abd9e333d33c67feed1d720d586b413e1bc29163e3bb697cb0c7e9d3 WatchSource:0}: Error finding container 41adc451abd9e333d33c67feed1d720d586b413e1bc29163e3bb697cb0c7e9d3: Status 404 returned error can't find the container with id 41adc451abd9e333d33c67feed1d720d586b413e1bc29163e3bb697cb0c7e9d3 Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.682519 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.683823 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.685902 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-pntpb" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.687606 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.688245 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.688389 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.691619 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.691807 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.691991 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.693420 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.752032 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/18ec698b-354c-4d4e-9126-16c493474617-server-conf\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.752091 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/18ec698b-354c-4d4e-9126-16c493474617-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.752151 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18ec698b-354c-4d4e-9126-16c493474617-config-data\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.752206 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/18ec698b-354c-4d4e-9126-16c493474617-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.752245 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.752354 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlmx5\" (UniqueName: \"kubernetes.io/projected/18ec698b-354c-4d4e-9126-16c493474617-kube-api-access-mlmx5\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.752434 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/18ec698b-354c-4d4e-9126-16c493474617-pod-info\") pod 
\"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.752486 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/18ec698b-354c-4d4e-9126-16c493474617-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.752515 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/18ec698b-354c-4d4e-9126-16c493474617-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.752650 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/18ec698b-354c-4d4e-9126-16c493474617-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.752674 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/18ec698b-354c-4d4e-9126-16c493474617-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.853936 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/18ec698b-354c-4d4e-9126-16c493474617-pod-info\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " 
pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.853997 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/18ec698b-354c-4d4e-9126-16c493474617-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.854032 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/18ec698b-354c-4d4e-9126-16c493474617-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.854109 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/18ec698b-354c-4d4e-9126-16c493474617-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.854129 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/18ec698b-354c-4d4e-9126-16c493474617-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.854932 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/18ec698b-354c-4d4e-9126-16c493474617-server-conf\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.854963 4563 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18ec698b-354c-4d4e-9126-16c493474617-config-data\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.854980 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/18ec698b-354c-4d4e-9126-16c493474617-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.855014 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/18ec698b-354c-4d4e-9126-16c493474617-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.855038 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.855074 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlmx5\" (UniqueName: \"kubernetes.io/projected/18ec698b-354c-4d4e-9126-16c493474617-kube-api-access-mlmx5\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.855827 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18ec698b-354c-4d4e-9126-16c493474617-config-data\") pod \"rabbitmq-server-0\" 
(UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.854484 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/18ec698b-354c-4d4e-9126-16c493474617-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.854701 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/18ec698b-354c-4d4e-9126-16c493474617-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.856334 4563 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.856814 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/18ec698b-354c-4d4e-9126-16c493474617-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.859339 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/18ec698b-354c-4d4e-9126-16c493474617-server-conf\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.860657 4563 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/18ec698b-354c-4d4e-9126-16c493474617-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.861053 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/18ec698b-354c-4d4e-9126-16c493474617-pod-info\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.862198 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/18ec698b-354c-4d4e-9126-16c493474617-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.868445 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/18ec698b-354c-4d4e-9126-16c493474617-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.871132 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlmx5\" (UniqueName: \"kubernetes.io/projected/18ec698b-354c-4d4e-9126-16c493474617-kube-api-access-mlmx5\") pod \"rabbitmq-server-0\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.874972 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: 
\"18ec698b-354c-4d4e-9126-16c493474617\") " pod="openstack/rabbitmq-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.902486 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c6d9948dc-d7bd7" event={"ID":"2a278b53-bccd-4b13-aeaf-14674dacdb41","Type":"ContainerStarted","Data":"1e49619eb37ef4b2e140fad8fe4e5c1ba67378fc744dfd898362836910d05fdf"} Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.904955 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6486446b9f-zhm5r" event={"ID":"cf409104-ddfd-4643-a35c-3c34c6ce2d14","Type":"ContainerStarted","Data":"41adc451abd9e333d33c67feed1d720d586b413e1bc29163e3bb697cb0c7e9d3"} Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.942506 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.943706 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.946484 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.946857 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-wfdxh" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.946878 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.947124 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.948744 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.953034 4563 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.955942 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.959514 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 24 09:17:12 crc kubenswrapper[4563]: I1124 09:17:12.966421 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qzvm8" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.010429 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.059113 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e4286a17-bf24-4c91-91cb-6e3f3d731d24-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.059163 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e4286a17-bf24-4c91-91cb-6e3f3d731d24-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.059191 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckl45\" (UniqueName: \"kubernetes.io/projected/e4286a17-bf24-4c91-91cb-6e3f3d731d24-kube-api-access-ckl45\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" 
Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.059221 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e4286a17-bf24-4c91-91cb-6e3f3d731d24-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.059244 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e4286a17-bf24-4c91-91cb-6e3f3d731d24-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.059273 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.059291 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e4286a17-bf24-4c91-91cb-6e3f3d731d24-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.059310 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e4286a17-bf24-4c91-91cb-6e3f3d731d24-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: 
I1124 09:17:13.059327 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e4286a17-bf24-4c91-91cb-6e3f3d731d24-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.059349 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e4286a17-bf24-4c91-91cb-6e3f3d731d24-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.059369 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4286a17-bf24-4c91-91cb-6e3f3d731d24-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.160522 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e4286a17-bf24-4c91-91cb-6e3f3d731d24-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.160579 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e4286a17-bf24-4c91-91cb-6e3f3d731d24-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.160610 4563 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckl45\" (UniqueName: \"kubernetes.io/projected/e4286a17-bf24-4c91-91cb-6e3f3d731d24-kube-api-access-ckl45\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.160664 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e4286a17-bf24-4c91-91cb-6e3f3d731d24-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.160688 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e4286a17-bf24-4c91-91cb-6e3f3d731d24-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.160720 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.160741 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e4286a17-bf24-4c91-91cb-6e3f3d731d24-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.160760 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/e4286a17-bf24-4c91-91cb-6e3f3d731d24-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.160781 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e4286a17-bf24-4c91-91cb-6e3f3d731d24-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.160801 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e4286a17-bf24-4c91-91cb-6e3f3d731d24-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.160823 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4286a17-bf24-4c91-91cb-6e3f3d731d24-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.161449 4563 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.161599 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4286a17-bf24-4c91-91cb-6e3f3d731d24-config-data\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.168842 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e4286a17-bf24-4c91-91cb-6e3f3d731d24-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.169543 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e4286a17-bf24-4c91-91cb-6e3f3d731d24-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.181807 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e4286a17-bf24-4c91-91cb-6e3f3d731d24-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.184168 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e4286a17-bf24-4c91-91cb-6e3f3d731d24-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.184663 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e4286a17-bf24-4c91-91cb-6e3f3d731d24-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 
crc kubenswrapper[4563]: I1124 09:17:13.184844 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e4286a17-bf24-4c91-91cb-6e3f3d731d24-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.184865 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e4286a17-bf24-4c91-91cb-6e3f3d731d24-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.186705 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e4286a17-bf24-4c91-91cb-6e3f3d731d24-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.187488 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckl45\" (UniqueName: \"kubernetes.io/projected/e4286a17-bf24-4c91-91cb-6e3f3d731d24-kube-api-access-ckl45\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.200328 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.210894 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qzvm8"] 
Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.270765 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.537118 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 24 09:17:13 crc kubenswrapper[4563]: W1124 09:17:13.551413 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18ec698b_354c_4d4e_9126_16c493474617.slice/crio-9f9424183d254ee3d4d77aeb91526ff42f93d3ed3e47cb81dcfc8c5f41cc2afc WatchSource:0}: Error finding container 9f9424183d254ee3d4d77aeb91526ff42f93d3ed3e47cb81dcfc8c5f41cc2afc: Status 404 returned error can't find the container with id 9f9424183d254ee3d4d77aeb91526ff42f93d3ed3e47cb81dcfc8c5f41cc2afc Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.770249 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 24 09:17:13 crc kubenswrapper[4563]: I1124 09:17:13.925197 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"18ec698b-354c-4d4e-9126-16c493474617","Type":"ContainerStarted","Data":"9f9424183d254ee3d4d77aeb91526ff42f93d3ed3e47cb81dcfc8c5f41cc2afc"} Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.231804 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.233669 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.238942 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.239011 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.241977 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-d5stm" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.244107 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.244367 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.246685 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.396841 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0de325e-9aea-4ee2-9cc4-093f3d8d3f65-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b0de325e-9aea-4ee2-9cc4-093f3d8d3f65\") " pod="openstack/openstack-galera-0" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.396927 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b0de325e-9aea-4ee2-9cc4-093f3d8d3f65-config-data-default\") pod \"openstack-galera-0\" (UID: \"b0de325e-9aea-4ee2-9cc4-093f3d8d3f65\") " pod="openstack/openstack-galera-0" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.397173 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-6dh4k\" (UniqueName: \"kubernetes.io/projected/b0de325e-9aea-4ee2-9cc4-093f3d8d3f65-kube-api-access-6dh4k\") pod \"openstack-galera-0\" (UID: \"b0de325e-9aea-4ee2-9cc4-093f3d8d3f65\") " pod="openstack/openstack-galera-0" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.397250 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b0de325e-9aea-4ee2-9cc4-093f3d8d3f65-kolla-config\") pod \"openstack-galera-0\" (UID: \"b0de325e-9aea-4ee2-9cc4-093f3d8d3f65\") " pod="openstack/openstack-galera-0" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.397339 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"b0de325e-9aea-4ee2-9cc4-093f3d8d3f65\") " pod="openstack/openstack-galera-0" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.397365 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b0de325e-9aea-4ee2-9cc4-093f3d8d3f65-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b0de325e-9aea-4ee2-9cc4-093f3d8d3f65\") " pod="openstack/openstack-galera-0" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.397450 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0de325e-9aea-4ee2-9cc4-093f3d8d3f65-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b0de325e-9aea-4ee2-9cc4-093f3d8d3f65\") " pod="openstack/openstack-galera-0" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.397599 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b0de325e-9aea-4ee2-9cc4-093f3d8d3f65-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b0de325e-9aea-4ee2-9cc4-093f3d8d3f65\") " pod="openstack/openstack-galera-0" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.498932 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b0de325e-9aea-4ee2-9cc4-093f3d8d3f65-kolla-config\") pod \"openstack-galera-0\" (UID: \"b0de325e-9aea-4ee2-9cc4-093f3d8d3f65\") " pod="openstack/openstack-galera-0" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.499010 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"b0de325e-9aea-4ee2-9cc4-093f3d8d3f65\") " pod="openstack/openstack-galera-0" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.499041 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b0de325e-9aea-4ee2-9cc4-093f3d8d3f65-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b0de325e-9aea-4ee2-9cc4-093f3d8d3f65\") " pod="openstack/openstack-galera-0" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.499076 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0de325e-9aea-4ee2-9cc4-093f3d8d3f65-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b0de325e-9aea-4ee2-9cc4-093f3d8d3f65\") " pod="openstack/openstack-galera-0" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.499188 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0de325e-9aea-4ee2-9cc4-093f3d8d3f65-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b0de325e-9aea-4ee2-9cc4-093f3d8d3f65\") " 
pod="openstack/openstack-galera-0" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.499254 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0de325e-9aea-4ee2-9cc4-093f3d8d3f65-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b0de325e-9aea-4ee2-9cc4-093f3d8d3f65\") " pod="openstack/openstack-galera-0" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.499294 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b0de325e-9aea-4ee2-9cc4-093f3d8d3f65-config-data-default\") pod \"openstack-galera-0\" (UID: \"b0de325e-9aea-4ee2-9cc4-093f3d8d3f65\") " pod="openstack/openstack-galera-0" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.499325 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dh4k\" (UniqueName: \"kubernetes.io/projected/b0de325e-9aea-4ee2-9cc4-093f3d8d3f65-kube-api-access-6dh4k\") pod \"openstack-galera-0\" (UID: \"b0de325e-9aea-4ee2-9cc4-093f3d8d3f65\") " pod="openstack/openstack-galera-0" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.499493 4563 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"b0de325e-9aea-4ee2-9cc4-093f3d8d3f65\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.499660 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b0de325e-9aea-4ee2-9cc4-093f3d8d3f65-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b0de325e-9aea-4ee2-9cc4-093f3d8d3f65\") " pod="openstack/openstack-galera-0" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.500927 4563 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b0de325e-9aea-4ee2-9cc4-093f3d8d3f65-kolla-config\") pod \"openstack-galera-0\" (UID: \"b0de325e-9aea-4ee2-9cc4-093f3d8d3f65\") " pod="openstack/openstack-galera-0" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.502213 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b0de325e-9aea-4ee2-9cc4-093f3d8d3f65-config-data-default\") pod \"openstack-galera-0\" (UID: \"b0de325e-9aea-4ee2-9cc4-093f3d8d3f65\") " pod="openstack/openstack-galera-0" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.504364 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0de325e-9aea-4ee2-9cc4-093f3d8d3f65-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b0de325e-9aea-4ee2-9cc4-093f3d8d3f65\") " pod="openstack/openstack-galera-0" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.507460 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0de325e-9aea-4ee2-9cc4-093f3d8d3f65-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b0de325e-9aea-4ee2-9cc4-093f3d8d3f65\") " pod="openstack/openstack-galera-0" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.508684 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0de325e-9aea-4ee2-9cc4-093f3d8d3f65-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b0de325e-9aea-4ee2-9cc4-093f3d8d3f65\") " pod="openstack/openstack-galera-0" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.516429 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dh4k\" (UniqueName: 
\"kubernetes.io/projected/b0de325e-9aea-4ee2-9cc4-093f3d8d3f65-kube-api-access-6dh4k\") pod \"openstack-galera-0\" (UID: \"b0de325e-9aea-4ee2-9cc4-093f3d8d3f65\") " pod="openstack/openstack-galera-0" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.528380 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"b0de325e-9aea-4ee2-9cc4-093f3d8d3f65\") " pod="openstack/openstack-galera-0" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.558206 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Nov 24 09:17:14 crc kubenswrapper[4563]: I1124 09:17:14.941755 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qzvm8" podUID="757c88fd-7bed-4810-a374-f1246f058983" containerName="registry-server" containerID="cri-o://f60157b6cab8eef58fe8a370219450715e305cbd7672d8ea52fd3b703fafc7d9" gracePeriod=2 Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.618487 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.621770 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.623860 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-5gf4r" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.624145 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.625297 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.625427 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.632948 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.729618 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c2b6368-21fd-4c13-b008-5fe4be95dc8d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2c2b6368-21fd-4c13-b008-5fe4be95dc8d\") " pod="openstack/openstack-cell1-galera-0" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.729784 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2c2b6368-21fd-4c13-b008-5fe4be95dc8d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2c2b6368-21fd-4c13-b008-5fe4be95dc8d\") " pod="openstack/openstack-cell1-galera-0" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.730656 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2c2b6368-21fd-4c13-b008-5fe4be95dc8d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2c2b6368-21fd-4c13-b008-5fe4be95dc8d\") " pod="openstack/openstack-cell1-galera-0" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.730701 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2c2b6368-21fd-4c13-b008-5fe4be95dc8d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2c2b6368-21fd-4c13-b008-5fe4be95dc8d\") " pod="openstack/openstack-cell1-galera-0" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.730780 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2c2b6368-21fd-4c13-b008-5fe4be95dc8d\") " pod="openstack/openstack-cell1-galera-0" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.731067 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c2b6368-21fd-4c13-b008-5fe4be95dc8d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2c2b6368-21fd-4c13-b008-5fe4be95dc8d\") " pod="openstack/openstack-cell1-galera-0" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.731275 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2c2b6368-21fd-4c13-b008-5fe4be95dc8d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2c2b6368-21fd-4c13-b008-5fe4be95dc8d\") " pod="openstack/openstack-cell1-galera-0" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.731391 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-245l2\" 
(UniqueName: \"kubernetes.io/projected/2c2b6368-21fd-4c13-b008-5fe4be95dc8d-kube-api-access-245l2\") pod \"openstack-cell1-galera-0\" (UID: \"2c2b6368-21fd-4c13-b008-5fe4be95dc8d\") " pod="openstack/openstack-cell1-galera-0" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.832994 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2c2b6368-21fd-4c13-b008-5fe4be95dc8d\") " pod="openstack/openstack-cell1-galera-0" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.833046 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c2b6368-21fd-4c13-b008-5fe4be95dc8d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2c2b6368-21fd-4c13-b008-5fe4be95dc8d\") " pod="openstack/openstack-cell1-galera-0" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.833131 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2c2b6368-21fd-4c13-b008-5fe4be95dc8d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2c2b6368-21fd-4c13-b008-5fe4be95dc8d\") " pod="openstack/openstack-cell1-galera-0" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.833151 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-245l2\" (UniqueName: \"kubernetes.io/projected/2c2b6368-21fd-4c13-b008-5fe4be95dc8d-kube-api-access-245l2\") pod \"openstack-cell1-galera-0\" (UID: \"2c2b6368-21fd-4c13-b008-5fe4be95dc8d\") " pod="openstack/openstack-cell1-galera-0" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.833180 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2c2b6368-21fd-4c13-b008-5fe4be95dc8d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2c2b6368-21fd-4c13-b008-5fe4be95dc8d\") " pod="openstack/openstack-cell1-galera-0" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.833200 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2c2b6368-21fd-4c13-b008-5fe4be95dc8d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2c2b6368-21fd-4c13-b008-5fe4be95dc8d\") " pod="openstack/openstack-cell1-galera-0" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.833263 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c2b6368-21fd-4c13-b008-5fe4be95dc8d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2c2b6368-21fd-4c13-b008-5fe4be95dc8d\") " pod="openstack/openstack-cell1-galera-0" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.833283 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2c2b6368-21fd-4c13-b008-5fe4be95dc8d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2c2b6368-21fd-4c13-b008-5fe4be95dc8d\") " pod="openstack/openstack-cell1-galera-0" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.833544 4563 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2c2b6368-21fd-4c13-b008-5fe4be95dc8d\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-cell1-galera-0" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.833769 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/2c2b6368-21fd-4c13-b008-5fe4be95dc8d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2c2b6368-21fd-4c13-b008-5fe4be95dc8d\") " pod="openstack/openstack-cell1-galera-0" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.835072 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2c2b6368-21fd-4c13-b008-5fe4be95dc8d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2c2b6368-21fd-4c13-b008-5fe4be95dc8d\") " pod="openstack/openstack-cell1-galera-0" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.838069 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2c2b6368-21fd-4c13-b008-5fe4be95dc8d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2c2b6368-21fd-4c13-b008-5fe4be95dc8d\") " pod="openstack/openstack-cell1-galera-0" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.839527 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c2b6368-21fd-4c13-b008-5fe4be95dc8d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2c2b6368-21fd-4c13-b008-5fe4be95dc8d\") " pod="openstack/openstack-cell1-galera-0" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.841843 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c2b6368-21fd-4c13-b008-5fe4be95dc8d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2c2b6368-21fd-4c13-b008-5fe4be95dc8d\") " pod="openstack/openstack-cell1-galera-0" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.846500 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c2b6368-21fd-4c13-b008-5fe4be95dc8d-galera-tls-certs\") pod 
\"openstack-cell1-galera-0\" (UID: \"2c2b6368-21fd-4c13-b008-5fe4be95dc8d\") " pod="openstack/openstack-cell1-galera-0" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.853948 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-245l2\" (UniqueName: \"kubernetes.io/projected/2c2b6368-21fd-4c13-b008-5fe4be95dc8d-kube-api-access-245l2\") pod \"openstack-cell1-galera-0\" (UID: \"2c2b6368-21fd-4c13-b008-5fe4be95dc8d\") " pod="openstack/openstack-cell1-galera-0" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.867816 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2c2b6368-21fd-4c13-b008-5fe4be95dc8d\") " pod="openstack/openstack-cell1-galera-0" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.902225 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.903143 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.905290 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.905310 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.905878 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-nds6x" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.912816 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.953462 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.957019 4563 generic.go:334] "Generic (PLEG): container finished" podID="757c88fd-7bed-4810-a374-f1246f058983" containerID="f60157b6cab8eef58fe8a370219450715e305cbd7672d8ea52fd3b703fafc7d9" exitCode=0 Nov 24 09:17:15 crc kubenswrapper[4563]: I1124 09:17:15.957074 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzvm8" event={"ID":"757c88fd-7bed-4810-a374-f1246f058983","Type":"ContainerDied","Data":"f60157b6cab8eef58fe8a370219450715e305cbd7672d8ea52fd3b703fafc7d9"} Nov 24 09:17:16 crc kubenswrapper[4563]: I1124 09:17:16.036938 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdeec6b1-05d8-4275-839f-a02e22e26f61-combined-ca-bundle\") pod \"memcached-0\" (UID: \"bdeec6b1-05d8-4275-839f-a02e22e26f61\") " pod="openstack/memcached-0" Nov 24 09:17:16 crc kubenswrapper[4563]: I1124 09:17:16.037032 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvrns\" (UniqueName: \"kubernetes.io/projected/bdeec6b1-05d8-4275-839f-a02e22e26f61-kube-api-access-kvrns\") pod \"memcached-0\" (UID: \"bdeec6b1-05d8-4275-839f-a02e22e26f61\") " pod="openstack/memcached-0" Nov 24 09:17:16 crc kubenswrapper[4563]: I1124 09:17:16.037060 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bdeec6b1-05d8-4275-839f-a02e22e26f61-kolla-config\") pod \"memcached-0\" (UID: \"bdeec6b1-05d8-4275-839f-a02e22e26f61\") " pod="openstack/memcached-0" Nov 24 09:17:16 crc kubenswrapper[4563]: I1124 09:17:16.037123 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bdeec6b1-05d8-4275-839f-a02e22e26f61-memcached-tls-certs\") pod \"memcached-0\" (UID: \"bdeec6b1-05d8-4275-839f-a02e22e26f61\") " pod="openstack/memcached-0" Nov 24 09:17:16 crc kubenswrapper[4563]: I1124 09:17:16.037148 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bdeec6b1-05d8-4275-839f-a02e22e26f61-config-data\") pod \"memcached-0\" (UID: \"bdeec6b1-05d8-4275-839f-a02e22e26f61\") " pod="openstack/memcached-0" Nov 24 09:17:16 crc kubenswrapper[4563]: I1124 09:17:16.138446 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdeec6b1-05d8-4275-839f-a02e22e26f61-memcached-tls-certs\") pod \"memcached-0\" (UID: \"bdeec6b1-05d8-4275-839f-a02e22e26f61\") " pod="openstack/memcached-0" Nov 24 09:17:16 crc kubenswrapper[4563]: I1124 09:17:16.138501 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bdeec6b1-05d8-4275-839f-a02e22e26f61-config-data\") pod \"memcached-0\" (UID: \"bdeec6b1-05d8-4275-839f-a02e22e26f61\") " pod="openstack/memcached-0" Nov 24 09:17:16 crc kubenswrapper[4563]: I1124 09:17:16.138528 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdeec6b1-05d8-4275-839f-a02e22e26f61-combined-ca-bundle\") pod \"memcached-0\" (UID: \"bdeec6b1-05d8-4275-839f-a02e22e26f61\") " pod="openstack/memcached-0" Nov 24 09:17:16 crc kubenswrapper[4563]: I1124 09:17:16.138584 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvrns\" (UniqueName: \"kubernetes.io/projected/bdeec6b1-05d8-4275-839f-a02e22e26f61-kube-api-access-kvrns\") pod \"memcached-0\" (UID: \"bdeec6b1-05d8-4275-839f-a02e22e26f61\") " pod="openstack/memcached-0" Nov 24 
09:17:16 crc kubenswrapper[4563]: I1124 09:17:16.138608 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bdeec6b1-05d8-4275-839f-a02e22e26f61-kolla-config\") pod \"memcached-0\" (UID: \"bdeec6b1-05d8-4275-839f-a02e22e26f61\") " pod="openstack/memcached-0" Nov 24 09:17:16 crc kubenswrapper[4563]: I1124 09:17:16.139432 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bdeec6b1-05d8-4275-839f-a02e22e26f61-kolla-config\") pod \"memcached-0\" (UID: \"bdeec6b1-05d8-4275-839f-a02e22e26f61\") " pod="openstack/memcached-0" Nov 24 09:17:16 crc kubenswrapper[4563]: I1124 09:17:16.139452 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bdeec6b1-05d8-4275-839f-a02e22e26f61-config-data\") pod \"memcached-0\" (UID: \"bdeec6b1-05d8-4275-839f-a02e22e26f61\") " pod="openstack/memcached-0" Nov 24 09:17:16 crc kubenswrapper[4563]: I1124 09:17:16.141409 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdeec6b1-05d8-4275-839f-a02e22e26f61-combined-ca-bundle\") pod \"memcached-0\" (UID: \"bdeec6b1-05d8-4275-839f-a02e22e26f61\") " pod="openstack/memcached-0" Nov 24 09:17:16 crc kubenswrapper[4563]: I1124 09:17:16.148564 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdeec6b1-05d8-4275-839f-a02e22e26f61-memcached-tls-certs\") pod \"memcached-0\" (UID: \"bdeec6b1-05d8-4275-839f-a02e22e26f61\") " pod="openstack/memcached-0" Nov 24 09:17:16 crc kubenswrapper[4563]: I1124 09:17:16.153376 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvrns\" (UniqueName: \"kubernetes.io/projected/bdeec6b1-05d8-4275-839f-a02e22e26f61-kube-api-access-kvrns\") 
pod \"memcached-0\" (UID: \"bdeec6b1-05d8-4275-839f-a02e22e26f61\") " pod="openstack/memcached-0" Nov 24 09:17:16 crc kubenswrapper[4563]: I1124 09:17:16.229299 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 24 09:17:16 crc kubenswrapper[4563]: I1124 09:17:16.930907 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-246hk" Nov 24 09:17:16 crc kubenswrapper[4563]: I1124 09:17:16.931494 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-246hk" Nov 24 09:17:16 crc kubenswrapper[4563]: I1124 09:17:16.975072 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-246hk" Nov 24 09:17:17 crc kubenswrapper[4563]: I1124 09:17:17.026673 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-246hk" Nov 24 09:17:17 crc kubenswrapper[4563]: I1124 09:17:17.208600 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-246hk"] Nov 24 09:17:17 crc kubenswrapper[4563]: W1124 09:17:17.466308 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4286a17_bf24_4c91_91cb_6e3f3d731d24.slice/crio-f96a73c1ef89c6805e0fed827ea5f529bbe6267141e628c088464edcf9779f53 WatchSource:0}: Error finding container f96a73c1ef89c6805e0fed827ea5f529bbe6267141e628c088464edcf9779f53: Status 404 returned error can't find the container with id f96a73c1ef89c6805e0fed827ea5f529bbe6267141e628c088464edcf9779f53 Nov 24 09:17:17 crc kubenswrapper[4563]: I1124 09:17:17.718190 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 24 09:17:17 crc kubenswrapper[4563]: I1124 09:17:17.721074 4563 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 24 09:17:17 crc kubenswrapper[4563]: I1124 09:17:17.723399 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-5zlht" Nov 24 09:17:17 crc kubenswrapper[4563]: I1124 09:17:17.724543 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 24 09:17:17 crc kubenswrapper[4563]: I1124 09:17:17.780612 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whrk7\" (UniqueName: \"kubernetes.io/projected/8d28277e-c9e2-4e14-bcda-b8e7684ce6f2-kube-api-access-whrk7\") pod \"kube-state-metrics-0\" (UID: \"8d28277e-c9e2-4e14-bcda-b8e7684ce6f2\") " pod="openstack/kube-state-metrics-0" Nov 24 09:17:17 crc kubenswrapper[4563]: I1124 09:17:17.881556 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whrk7\" (UniqueName: \"kubernetes.io/projected/8d28277e-c9e2-4e14-bcda-b8e7684ce6f2-kube-api-access-whrk7\") pod \"kube-state-metrics-0\" (UID: \"8d28277e-c9e2-4e14-bcda-b8e7684ce6f2\") " pod="openstack/kube-state-metrics-0" Nov 24 09:17:17 crc kubenswrapper[4563]: I1124 09:17:17.899821 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whrk7\" (UniqueName: \"kubernetes.io/projected/8d28277e-c9e2-4e14-bcda-b8e7684ce6f2-kube-api-access-whrk7\") pod \"kube-state-metrics-0\" (UID: \"8d28277e-c9e2-4e14-bcda-b8e7684ce6f2\") " pod="openstack/kube-state-metrics-0" Nov 24 09:17:17 crc kubenswrapper[4563]: I1124 09:17:17.986936 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e4286a17-bf24-4c91-91cb-6e3f3d731d24","Type":"ContainerStarted","Data":"f96a73c1ef89c6805e0fed827ea5f529bbe6267141e628c088464edcf9779f53"} Nov 24 09:17:18 crc kubenswrapper[4563]: I1124 09:17:18.049246 4563 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 24 09:17:18 crc kubenswrapper[4563]: I1124 09:17:18.995781 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-246hk" podUID="a09025e4-08f3-453a-8356-3ef3eba4b04d" containerName="registry-server" containerID="cri-o://a852efc0c1439e27e50602b5b6c9f248a056dbf0e600d7d04554e46e04beeed3" gracePeriod=2 Nov 24 09:17:20 crc kubenswrapper[4563]: I1124 09:17:20.006846 4563 generic.go:334] "Generic (PLEG): container finished" podID="a09025e4-08f3-453a-8356-3ef3eba4b04d" containerID="a852efc0c1439e27e50602b5b6c9f248a056dbf0e600d7d04554e46e04beeed3" exitCode=0 Nov 24 09:17:20 crc kubenswrapper[4563]: I1124 09:17:20.006921 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-246hk" event={"ID":"a09025e4-08f3-453a-8356-3ef3eba4b04d","Type":"ContainerDied","Data":"a852efc0c1439e27e50602b5b6c9f248a056dbf0e600d7d04554e46e04beeed3"} Nov 24 09:17:21 crc kubenswrapper[4563]: I1124 09:17:21.811278 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-qtfnl"] Nov 24 09:17:21 crc kubenswrapper[4563]: I1124 09:17:21.813459 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-qtfnl" Nov 24 09:17:21 crc kubenswrapper[4563]: I1124 09:17:21.816357 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-ppmf9" Nov 24 09:17:21 crc kubenswrapper[4563]: I1124 09:17:21.816803 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Nov 24 09:17:21 crc kubenswrapper[4563]: I1124 09:17:21.817302 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Nov 24 09:17:21 crc kubenswrapper[4563]: I1124 09:17:21.822918 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-6z24f"] Nov 24 09:17:21 crc kubenswrapper[4563]: I1124 09:17:21.824919 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-6z24f" Nov 24 09:17:21 crc kubenswrapper[4563]: I1124 09:17:21.829911 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qtfnl"] Nov 24 09:17:21 crc kubenswrapper[4563]: I1124 09:17:21.834446 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-6z24f"] Nov 24 09:17:21 crc kubenswrapper[4563]: I1124 09:17:21.948865 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csnrs\" (UniqueName: \"kubernetes.io/projected/01d7f46a-ff30-4904-a63a-8d41cea54dd7-kube-api-access-csnrs\") pod \"ovn-controller-ovs-6z24f\" (UID: \"01d7f46a-ff30-4904-a63a-8d41cea54dd7\") " pod="openstack/ovn-controller-ovs-6z24f" Nov 24 09:17:21 crc kubenswrapper[4563]: I1124 09:17:21.948957 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/241e854a-eb29-4933-98be-bad6b9295260-var-log-ovn\") pod \"ovn-controller-qtfnl\" (UID: 
\"241e854a-eb29-4933-98be-bad6b9295260\") " pod="openstack/ovn-controller-qtfnl" Nov 24 09:17:21 crc kubenswrapper[4563]: I1124 09:17:21.948986 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/01d7f46a-ff30-4904-a63a-8d41cea54dd7-var-run\") pod \"ovn-controller-ovs-6z24f\" (UID: \"01d7f46a-ff30-4904-a63a-8d41cea54dd7\") " pod="openstack/ovn-controller-ovs-6z24f" Nov 24 09:17:21 crc kubenswrapper[4563]: I1124 09:17:21.949003 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/01d7f46a-ff30-4904-a63a-8d41cea54dd7-var-log\") pod \"ovn-controller-ovs-6z24f\" (UID: \"01d7f46a-ff30-4904-a63a-8d41cea54dd7\") " pod="openstack/ovn-controller-ovs-6z24f" Nov 24 09:17:21 crc kubenswrapper[4563]: I1124 09:17:21.949029 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/01d7f46a-ff30-4904-a63a-8d41cea54dd7-etc-ovs\") pod \"ovn-controller-ovs-6z24f\" (UID: \"01d7f46a-ff30-4904-a63a-8d41cea54dd7\") " pod="openstack/ovn-controller-ovs-6z24f" Nov 24 09:17:21 crc kubenswrapper[4563]: I1124 09:17:21.949130 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/241e854a-eb29-4933-98be-bad6b9295260-scripts\") pod \"ovn-controller-qtfnl\" (UID: \"241e854a-eb29-4933-98be-bad6b9295260\") " pod="openstack/ovn-controller-qtfnl" Nov 24 09:17:21 crc kubenswrapper[4563]: I1124 09:17:21.949172 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrpl2\" (UniqueName: \"kubernetes.io/projected/241e854a-eb29-4933-98be-bad6b9295260-kube-api-access-rrpl2\") pod \"ovn-controller-qtfnl\" (UID: \"241e854a-eb29-4933-98be-bad6b9295260\") " 
pod="openstack/ovn-controller-qtfnl" Nov 24 09:17:21 crc kubenswrapper[4563]: I1124 09:17:21.949285 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/241e854a-eb29-4933-98be-bad6b9295260-var-run\") pod \"ovn-controller-qtfnl\" (UID: \"241e854a-eb29-4933-98be-bad6b9295260\") " pod="openstack/ovn-controller-qtfnl" Nov 24 09:17:21 crc kubenswrapper[4563]: I1124 09:17:21.949356 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/241e854a-eb29-4933-98be-bad6b9295260-ovn-controller-tls-certs\") pod \"ovn-controller-qtfnl\" (UID: \"241e854a-eb29-4933-98be-bad6b9295260\") " pod="openstack/ovn-controller-qtfnl" Nov 24 09:17:21 crc kubenswrapper[4563]: I1124 09:17:21.949484 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/01d7f46a-ff30-4904-a63a-8d41cea54dd7-var-lib\") pod \"ovn-controller-ovs-6z24f\" (UID: \"01d7f46a-ff30-4904-a63a-8d41cea54dd7\") " pod="openstack/ovn-controller-ovs-6z24f" Nov 24 09:17:21 crc kubenswrapper[4563]: I1124 09:17:21.949577 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/241e854a-eb29-4933-98be-bad6b9295260-var-run-ovn\") pod \"ovn-controller-qtfnl\" (UID: \"241e854a-eb29-4933-98be-bad6b9295260\") " pod="openstack/ovn-controller-qtfnl" Nov 24 09:17:21 crc kubenswrapper[4563]: I1124 09:17:21.949654 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/01d7f46a-ff30-4904-a63a-8d41cea54dd7-scripts\") pod \"ovn-controller-ovs-6z24f\" (UID: \"01d7f46a-ff30-4904-a63a-8d41cea54dd7\") " pod="openstack/ovn-controller-ovs-6z24f" Nov 24 09:17:21 
crc kubenswrapper[4563]: I1124 09:17:21.949732 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241e854a-eb29-4933-98be-bad6b9295260-combined-ca-bundle\") pod \"ovn-controller-qtfnl\" (UID: \"241e854a-eb29-4933-98be-bad6b9295260\") " pod="openstack/ovn-controller-qtfnl" Nov 24 09:17:21 crc kubenswrapper[4563]: E1124 09:17:21.951477 4563 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f60157b6cab8eef58fe8a370219450715e305cbd7672d8ea52fd3b703fafc7d9 is running failed: container process not found" containerID="f60157b6cab8eef58fe8a370219450715e305cbd7672d8ea52fd3b703fafc7d9" cmd=["grpc_health_probe","-addr=:50051"] Nov 24 09:17:21 crc kubenswrapper[4563]: E1124 09:17:21.952045 4563 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f60157b6cab8eef58fe8a370219450715e305cbd7672d8ea52fd3b703fafc7d9 is running failed: container process not found" containerID="f60157b6cab8eef58fe8a370219450715e305cbd7672d8ea52fd3b703fafc7d9" cmd=["grpc_health_probe","-addr=:50051"] Nov 24 09:17:21 crc kubenswrapper[4563]: E1124 09:17:21.955707 4563 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f60157b6cab8eef58fe8a370219450715e305cbd7672d8ea52fd3b703fafc7d9 is running failed: container process not found" containerID="f60157b6cab8eef58fe8a370219450715e305cbd7672d8ea52fd3b703fafc7d9" cmd=["grpc_health_probe","-addr=:50051"] Nov 24 09:17:21 crc kubenswrapper[4563]: E1124 09:17:21.955755 4563 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f60157b6cab8eef58fe8a370219450715e305cbd7672d8ea52fd3b703fafc7d9 is 
running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-qzvm8" podUID="757c88fd-7bed-4810-a374-f1246f058983" containerName="registry-server" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.052131 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/01d7f46a-ff30-4904-a63a-8d41cea54dd7-var-lib\") pod \"ovn-controller-ovs-6z24f\" (UID: \"01d7f46a-ff30-4904-a63a-8d41cea54dd7\") " pod="openstack/ovn-controller-ovs-6z24f" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.052263 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/241e854a-eb29-4933-98be-bad6b9295260-var-run-ovn\") pod \"ovn-controller-qtfnl\" (UID: \"241e854a-eb29-4933-98be-bad6b9295260\") " pod="openstack/ovn-controller-qtfnl" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.052319 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/01d7f46a-ff30-4904-a63a-8d41cea54dd7-scripts\") pod \"ovn-controller-ovs-6z24f\" (UID: \"01d7f46a-ff30-4904-a63a-8d41cea54dd7\") " pod="openstack/ovn-controller-ovs-6z24f" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.052378 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241e854a-eb29-4933-98be-bad6b9295260-combined-ca-bundle\") pod \"ovn-controller-qtfnl\" (UID: \"241e854a-eb29-4933-98be-bad6b9295260\") " pod="openstack/ovn-controller-qtfnl" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.052413 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csnrs\" (UniqueName: \"kubernetes.io/projected/01d7f46a-ff30-4904-a63a-8d41cea54dd7-kube-api-access-csnrs\") pod \"ovn-controller-ovs-6z24f\" (UID: 
\"01d7f46a-ff30-4904-a63a-8d41cea54dd7\") " pod="openstack/ovn-controller-ovs-6z24f" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.052497 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/241e854a-eb29-4933-98be-bad6b9295260-var-log-ovn\") pod \"ovn-controller-qtfnl\" (UID: \"241e854a-eb29-4933-98be-bad6b9295260\") " pod="openstack/ovn-controller-qtfnl" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.052529 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/01d7f46a-ff30-4904-a63a-8d41cea54dd7-var-run\") pod \"ovn-controller-ovs-6z24f\" (UID: \"01d7f46a-ff30-4904-a63a-8d41cea54dd7\") " pod="openstack/ovn-controller-ovs-6z24f" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.052546 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/01d7f46a-ff30-4904-a63a-8d41cea54dd7-var-log\") pod \"ovn-controller-ovs-6z24f\" (UID: \"01d7f46a-ff30-4904-a63a-8d41cea54dd7\") " pod="openstack/ovn-controller-ovs-6z24f" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.052581 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/01d7f46a-ff30-4904-a63a-8d41cea54dd7-etc-ovs\") pod \"ovn-controller-ovs-6z24f\" (UID: \"01d7f46a-ff30-4904-a63a-8d41cea54dd7\") " pod="openstack/ovn-controller-ovs-6z24f" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.052618 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/241e854a-eb29-4933-98be-bad6b9295260-scripts\") pod \"ovn-controller-qtfnl\" (UID: \"241e854a-eb29-4933-98be-bad6b9295260\") " pod="openstack/ovn-controller-qtfnl" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.052654 4563 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrpl2\" (UniqueName: \"kubernetes.io/projected/241e854a-eb29-4933-98be-bad6b9295260-kube-api-access-rrpl2\") pod \"ovn-controller-qtfnl\" (UID: \"241e854a-eb29-4933-98be-bad6b9295260\") " pod="openstack/ovn-controller-qtfnl" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.052688 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/241e854a-eb29-4933-98be-bad6b9295260-var-run\") pod \"ovn-controller-qtfnl\" (UID: \"241e854a-eb29-4933-98be-bad6b9295260\") " pod="openstack/ovn-controller-qtfnl" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.052717 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/241e854a-eb29-4933-98be-bad6b9295260-ovn-controller-tls-certs\") pod \"ovn-controller-qtfnl\" (UID: \"241e854a-eb29-4933-98be-bad6b9295260\") " pod="openstack/ovn-controller-qtfnl" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.053036 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/241e854a-eb29-4933-98be-bad6b9295260-var-run-ovn\") pod \"ovn-controller-qtfnl\" (UID: \"241e854a-eb29-4933-98be-bad6b9295260\") " pod="openstack/ovn-controller-qtfnl" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.053165 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/01d7f46a-ff30-4904-a63a-8d41cea54dd7-var-log\") pod \"ovn-controller-ovs-6z24f\" (UID: \"01d7f46a-ff30-4904-a63a-8d41cea54dd7\") " pod="openstack/ovn-controller-ovs-6z24f" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.053385 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/01d7f46a-ff30-4904-a63a-8d41cea54dd7-var-lib\") pod \"ovn-controller-ovs-6z24f\" (UID: \"01d7f46a-ff30-4904-a63a-8d41cea54dd7\") " pod="openstack/ovn-controller-ovs-6z24f" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.053392 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/01d7f46a-ff30-4904-a63a-8d41cea54dd7-etc-ovs\") pod \"ovn-controller-ovs-6z24f\" (UID: \"01d7f46a-ff30-4904-a63a-8d41cea54dd7\") " pod="openstack/ovn-controller-ovs-6z24f" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.053533 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/01d7f46a-ff30-4904-a63a-8d41cea54dd7-var-run\") pod \"ovn-controller-ovs-6z24f\" (UID: \"01d7f46a-ff30-4904-a63a-8d41cea54dd7\") " pod="openstack/ovn-controller-ovs-6z24f" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.053568 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/241e854a-eb29-4933-98be-bad6b9295260-var-run\") pod \"ovn-controller-qtfnl\" (UID: \"241e854a-eb29-4933-98be-bad6b9295260\") " pod="openstack/ovn-controller-qtfnl" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.053579 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/241e854a-eb29-4933-98be-bad6b9295260-var-log-ovn\") pod \"ovn-controller-qtfnl\" (UID: \"241e854a-eb29-4933-98be-bad6b9295260\") " pod="openstack/ovn-controller-qtfnl" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.055405 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/241e854a-eb29-4933-98be-bad6b9295260-scripts\") pod \"ovn-controller-qtfnl\" (UID: \"241e854a-eb29-4933-98be-bad6b9295260\") " pod="openstack/ovn-controller-qtfnl" Nov 24 09:17:22 crc 
kubenswrapper[4563]: I1124 09:17:22.056240 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/01d7f46a-ff30-4904-a63a-8d41cea54dd7-scripts\") pod \"ovn-controller-ovs-6z24f\" (UID: \"01d7f46a-ff30-4904-a63a-8d41cea54dd7\") " pod="openstack/ovn-controller-ovs-6z24f" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.058273 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/241e854a-eb29-4933-98be-bad6b9295260-ovn-controller-tls-certs\") pod \"ovn-controller-qtfnl\" (UID: \"241e854a-eb29-4933-98be-bad6b9295260\") " pod="openstack/ovn-controller-qtfnl" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.058418 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241e854a-eb29-4933-98be-bad6b9295260-combined-ca-bundle\") pod \"ovn-controller-qtfnl\" (UID: \"241e854a-eb29-4933-98be-bad6b9295260\") " pod="openstack/ovn-controller-qtfnl" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.071272 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csnrs\" (UniqueName: \"kubernetes.io/projected/01d7f46a-ff30-4904-a63a-8d41cea54dd7-kube-api-access-csnrs\") pod \"ovn-controller-ovs-6z24f\" (UID: \"01d7f46a-ff30-4904-a63a-8d41cea54dd7\") " pod="openstack/ovn-controller-ovs-6z24f" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.072007 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrpl2\" (UniqueName: \"kubernetes.io/projected/241e854a-eb29-4933-98be-bad6b9295260-kube-api-access-rrpl2\") pod \"ovn-controller-qtfnl\" (UID: \"241e854a-eb29-4933-98be-bad6b9295260\") " pod="openstack/ovn-controller-qtfnl" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.136238 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-qtfnl" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.151397 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-6z24f" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.734581 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.736051 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.740371 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.740486 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-tqkwx" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.741241 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.741367 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.741482 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.746996 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.865883 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3399d213-46c4-42c1-9d69-26246c4ed771-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3399d213-46c4-42c1-9d69-26246c4ed771\") " pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:22 crc kubenswrapper[4563]: 
I1124 09:17:22.865941 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3399d213-46c4-42c1-9d69-26246c4ed771-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3399d213-46c4-42c1-9d69-26246c4ed771\") " pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.866080 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3399d213-46c4-42c1-9d69-26246c4ed771\") " pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.866177 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stb7x\" (UniqueName: \"kubernetes.io/projected/3399d213-46c4-42c1-9d69-26246c4ed771-kube-api-access-stb7x\") pod \"ovsdbserver-nb-0\" (UID: \"3399d213-46c4-42c1-9d69-26246c4ed771\") " pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.866292 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3399d213-46c4-42c1-9d69-26246c4ed771-config\") pod \"ovsdbserver-nb-0\" (UID: \"3399d213-46c4-42c1-9d69-26246c4ed771\") " pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.866318 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3399d213-46c4-42c1-9d69-26246c4ed771-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3399d213-46c4-42c1-9d69-26246c4ed771\") " pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.866362 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3399d213-46c4-42c1-9d69-26246c4ed771-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3399d213-46c4-42c1-9d69-26246c4ed771\") " pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.866475 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3399d213-46c4-42c1-9d69-26246c4ed771-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3399d213-46c4-42c1-9d69-26246c4ed771\") " pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.968096 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3399d213-46c4-42c1-9d69-26246c4ed771-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3399d213-46c4-42c1-9d69-26246c4ed771\") " pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.968139 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3399d213-46c4-42c1-9d69-26246c4ed771-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3399d213-46c4-42c1-9d69-26246c4ed771\") " pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.968222 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3399d213-46c4-42c1-9d69-26246c4ed771\") " pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.968251 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stb7x\" (UniqueName: 
\"kubernetes.io/projected/3399d213-46c4-42c1-9d69-26246c4ed771-kube-api-access-stb7x\") pod \"ovsdbserver-nb-0\" (UID: \"3399d213-46c4-42c1-9d69-26246c4ed771\") " pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.968299 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3399d213-46c4-42c1-9d69-26246c4ed771-config\") pod \"ovsdbserver-nb-0\" (UID: \"3399d213-46c4-42c1-9d69-26246c4ed771\") " pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.968320 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3399d213-46c4-42c1-9d69-26246c4ed771-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3399d213-46c4-42c1-9d69-26246c4ed771\") " pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.968350 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3399d213-46c4-42c1-9d69-26246c4ed771-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3399d213-46c4-42c1-9d69-26246c4ed771\") " pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.968421 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3399d213-46c4-42c1-9d69-26246c4ed771-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3399d213-46c4-42c1-9d69-26246c4ed771\") " pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.970077 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3399d213-46c4-42c1-9d69-26246c4ed771-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3399d213-46c4-42c1-9d69-26246c4ed771\") " 
pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.970327 4563 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3399d213-46c4-42c1-9d69-26246c4ed771\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.973279 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3399d213-46c4-42c1-9d69-26246c4ed771-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3399d213-46c4-42c1-9d69-26246c4ed771\") " pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.974086 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3399d213-46c4-42c1-9d69-26246c4ed771-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3399d213-46c4-42c1-9d69-26246c4ed771\") " pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.975245 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3399d213-46c4-42c1-9d69-26246c4ed771-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3399d213-46c4-42c1-9d69-26246c4ed771\") " pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.979811 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3399d213-46c4-42c1-9d69-26246c4ed771-config\") pod \"ovsdbserver-nb-0\" (UID: \"3399d213-46c4-42c1-9d69-26246c4ed771\") " pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.981416 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/3399d213-46c4-42c1-9d69-26246c4ed771-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3399d213-46c4-42c1-9d69-26246c4ed771\") " pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.982592 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stb7x\" (UniqueName: \"kubernetes.io/projected/3399d213-46c4-42c1-9d69-26246c4ed771-kube-api-access-stb7x\") pod \"ovsdbserver-nb-0\" (UID: \"3399d213-46c4-42c1-9d69-26246c4ed771\") " pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:22 crc kubenswrapper[4563]: I1124 09:17:22.990242 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3399d213-46c4-42c1-9d69-26246c4ed771\") " pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:23 crc kubenswrapper[4563]: I1124 09:17:23.059108 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.425876 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.427477 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.429595 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.429618 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-l9phw" Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.429623 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.429705 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.438482 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.614085 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04022879-4b41-4e57-ae94-a3517d382e7d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"04022879-4b41-4e57-ae94-a3517d382e7d\") " pod="openstack/ovsdbserver-sb-0" Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.614146 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04022879-4b41-4e57-ae94-a3517d382e7d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"04022879-4b41-4e57-ae94-a3517d382e7d\") " pod="openstack/ovsdbserver-sb-0" Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.614198 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfghp\" (UniqueName: \"kubernetes.io/projected/04022879-4b41-4e57-ae94-a3517d382e7d-kube-api-access-mfghp\") pod \"ovsdbserver-sb-0\" (UID: 
\"04022879-4b41-4e57-ae94-a3517d382e7d\") " pod="openstack/ovsdbserver-sb-0" Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.614309 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"04022879-4b41-4e57-ae94-a3517d382e7d\") " pod="openstack/ovsdbserver-sb-0" Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.614337 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04022879-4b41-4e57-ae94-a3517d382e7d-config\") pod \"ovsdbserver-sb-0\" (UID: \"04022879-4b41-4e57-ae94-a3517d382e7d\") " pod="openstack/ovsdbserver-sb-0" Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.614358 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/04022879-4b41-4e57-ae94-a3517d382e7d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"04022879-4b41-4e57-ae94-a3517d382e7d\") " pod="openstack/ovsdbserver-sb-0" Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.614387 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/04022879-4b41-4e57-ae94-a3517d382e7d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"04022879-4b41-4e57-ae94-a3517d382e7d\") " pod="openstack/ovsdbserver-sb-0" Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.614426 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/04022879-4b41-4e57-ae94-a3517d382e7d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"04022879-4b41-4e57-ae94-a3517d382e7d\") " pod="openstack/ovsdbserver-sb-0" Nov 24 
09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.717270 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"04022879-4b41-4e57-ae94-a3517d382e7d\") " pod="openstack/ovsdbserver-sb-0" Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.717341 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04022879-4b41-4e57-ae94-a3517d382e7d-config\") pod \"ovsdbserver-sb-0\" (UID: \"04022879-4b41-4e57-ae94-a3517d382e7d\") " pod="openstack/ovsdbserver-sb-0" Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.717378 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/04022879-4b41-4e57-ae94-a3517d382e7d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"04022879-4b41-4e57-ae94-a3517d382e7d\") " pod="openstack/ovsdbserver-sb-0" Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.717423 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/04022879-4b41-4e57-ae94-a3517d382e7d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"04022879-4b41-4e57-ae94-a3517d382e7d\") " pod="openstack/ovsdbserver-sb-0" Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.717499 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/04022879-4b41-4e57-ae94-a3517d382e7d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"04022879-4b41-4e57-ae94-a3517d382e7d\") " pod="openstack/ovsdbserver-sb-0" Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.717725 4563 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"04022879-4b41-4e57-ae94-a3517d382e7d\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-sb-0" Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.717914 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04022879-4b41-4e57-ae94-a3517d382e7d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"04022879-4b41-4e57-ae94-a3517d382e7d\") " pod="openstack/ovsdbserver-sb-0" Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.717952 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04022879-4b41-4e57-ae94-a3517d382e7d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"04022879-4b41-4e57-ae94-a3517d382e7d\") " pod="openstack/ovsdbserver-sb-0" Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.717994 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfghp\" (UniqueName: \"kubernetes.io/projected/04022879-4b41-4e57-ae94-a3517d382e7d-kube-api-access-mfghp\") pod \"ovsdbserver-sb-0\" (UID: \"04022879-4b41-4e57-ae94-a3517d382e7d\") " pod="openstack/ovsdbserver-sb-0" Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.718326 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/04022879-4b41-4e57-ae94-a3517d382e7d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"04022879-4b41-4e57-ae94-a3517d382e7d\") " pod="openstack/ovsdbserver-sb-0" Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.719123 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04022879-4b41-4e57-ae94-a3517d382e7d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"04022879-4b41-4e57-ae94-a3517d382e7d\") " pod="openstack/ovsdbserver-sb-0" 
Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.719189 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04022879-4b41-4e57-ae94-a3517d382e7d-config\") pod \"ovsdbserver-sb-0\" (UID: \"04022879-4b41-4e57-ae94-a3517d382e7d\") " pod="openstack/ovsdbserver-sb-0" Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.721381 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/04022879-4b41-4e57-ae94-a3517d382e7d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"04022879-4b41-4e57-ae94-a3517d382e7d\") " pod="openstack/ovsdbserver-sb-0" Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.721565 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/04022879-4b41-4e57-ae94-a3517d382e7d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"04022879-4b41-4e57-ae94-a3517d382e7d\") " pod="openstack/ovsdbserver-sb-0" Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.721883 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04022879-4b41-4e57-ae94-a3517d382e7d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"04022879-4b41-4e57-ae94-a3517d382e7d\") " pod="openstack/ovsdbserver-sb-0" Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.732870 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfghp\" (UniqueName: \"kubernetes.io/projected/04022879-4b41-4e57-ae94-a3517d382e7d-kube-api-access-mfghp\") pod \"ovsdbserver-sb-0\" (UID: \"04022879-4b41-4e57-ae94-a3517d382e7d\") " pod="openstack/ovsdbserver-sb-0" Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.736044 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"04022879-4b41-4e57-ae94-a3517d382e7d\") " pod="openstack/ovsdbserver-sb-0" Nov 24 09:17:25 crc kubenswrapper[4563]: I1124 09:17:25.741730 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 24 09:17:26 crc kubenswrapper[4563]: E1124 09:17:26.931284 4563 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a852efc0c1439e27e50602b5b6c9f248a056dbf0e600d7d04554e46e04beeed3 is running failed: container process not found" containerID="a852efc0c1439e27e50602b5b6c9f248a056dbf0e600d7d04554e46e04beeed3" cmd=["grpc_health_probe","-addr=:50051"] Nov 24 09:17:26 crc kubenswrapper[4563]: E1124 09:17:26.932252 4563 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a852efc0c1439e27e50602b5b6c9f248a056dbf0e600d7d04554e46e04beeed3 is running failed: container process not found" containerID="a852efc0c1439e27e50602b5b6c9f248a056dbf0e600d7d04554e46e04beeed3" cmd=["grpc_health_probe","-addr=:50051"] Nov 24 09:17:26 crc kubenswrapper[4563]: E1124 09:17:26.932584 4563 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a852efc0c1439e27e50602b5b6c9f248a056dbf0e600d7d04554e46e04beeed3 is running failed: container process not found" containerID="a852efc0c1439e27e50602b5b6c9f248a056dbf0e600d7d04554e46e04beeed3" cmd=["grpc_health_probe","-addr=:50051"] Nov 24 09:17:26 crc kubenswrapper[4563]: E1124 09:17:26.932662 4563 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a852efc0c1439e27e50602b5b6c9f248a056dbf0e600d7d04554e46e04beeed3 is running failed: container process not found" 
probeType="Readiness" pod="openshift-marketplace/community-operators-246hk" podUID="a09025e4-08f3-453a-8356-3ef3eba4b04d" containerName="registry-server" Nov 24 09:17:27 crc kubenswrapper[4563]: I1124 09:17:27.113066 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qzvm8" Nov 24 09:17:27 crc kubenswrapper[4563]: I1124 09:17:27.248298 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/757c88fd-7bed-4810-a374-f1246f058983-utilities\") pod \"757c88fd-7bed-4810-a374-f1246f058983\" (UID: \"757c88fd-7bed-4810-a374-f1246f058983\") " Nov 24 09:17:27 crc kubenswrapper[4563]: I1124 09:17:27.248712 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrhb9\" (UniqueName: \"kubernetes.io/projected/757c88fd-7bed-4810-a374-f1246f058983-kube-api-access-mrhb9\") pod \"757c88fd-7bed-4810-a374-f1246f058983\" (UID: \"757c88fd-7bed-4810-a374-f1246f058983\") " Nov 24 09:17:27 crc kubenswrapper[4563]: I1124 09:17:27.248830 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/757c88fd-7bed-4810-a374-f1246f058983-catalog-content\") pod \"757c88fd-7bed-4810-a374-f1246f058983\" (UID: \"757c88fd-7bed-4810-a374-f1246f058983\") " Nov 24 09:17:27 crc kubenswrapper[4563]: I1124 09:17:27.251664 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/757c88fd-7bed-4810-a374-f1246f058983-utilities" (OuterVolumeSpecName: "utilities") pod "757c88fd-7bed-4810-a374-f1246f058983" (UID: "757c88fd-7bed-4810-a374-f1246f058983"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:17:27 crc kubenswrapper[4563]: I1124 09:17:27.256480 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/757c88fd-7bed-4810-a374-f1246f058983-kube-api-access-mrhb9" (OuterVolumeSpecName: "kube-api-access-mrhb9") pod "757c88fd-7bed-4810-a374-f1246f058983" (UID: "757c88fd-7bed-4810-a374-f1246f058983"). InnerVolumeSpecName "kube-api-access-mrhb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:17:27 crc kubenswrapper[4563]: I1124 09:17:27.291025 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/757c88fd-7bed-4810-a374-f1246f058983-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "757c88fd-7bed-4810-a374-f1246f058983" (UID: "757c88fd-7bed-4810-a374-f1246f058983"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:17:27 crc kubenswrapper[4563]: I1124 09:17:27.352048 4563 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/757c88fd-7bed-4810-a374-f1246f058983-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:27 crc kubenswrapper[4563]: I1124 09:17:27.352081 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrhb9\" (UniqueName: \"kubernetes.io/projected/757c88fd-7bed-4810-a374-f1246f058983-kube-api-access-mrhb9\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:27 crc kubenswrapper[4563]: I1124 09:17:27.352095 4563 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/757c88fd-7bed-4810-a374-f1246f058983-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:27 crc kubenswrapper[4563]: E1124 09:17:27.665452 4563 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba" Nov 24 09:17:27 crc kubenswrapper[4563]: E1124 09:17:27.665623 4563 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xzjdx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*tr
ue,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7c6d9948dc-d7bd7_openstack(2a278b53-bccd-4b13-aeaf-14674dacdb41): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 24 09:17:27 crc kubenswrapper[4563]: E1124 09:17:27.666845 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7c6d9948dc-d7bd7" podUID="2a278b53-bccd-4b13-aeaf-14674dacdb41" Nov 24 09:17:27 crc kubenswrapper[4563]: E1124 09:17:27.675224 4563 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba" Nov 24 09:17:27 crc kubenswrapper[4563]: E1124 09:17:27.675408 4563 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hvbr6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6584b49599-r4gs7_openstack(d0fc0729-ebad-4f5e-9599-c04cc45fdfe7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 24 09:17:27 crc kubenswrapper[4563]: E1124 09:17:27.676574 4563 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6584b49599-r4gs7" podUID="d0fc0729-ebad-4f5e-9599-c04cc45fdfe7" Nov 24 09:17:27 crc kubenswrapper[4563]: E1124 09:17:27.676917 4563 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba" Nov 24 09:17:27 crc kubenswrapper[4563]: E1124 09:17:27.677050 4563 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cbl4n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6486446b9f-zhm5r_openstack(cf409104-ddfd-4643-a35c-3c34c6ce2d14): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 24 09:17:27 crc kubenswrapper[4563]: E1124 09:17:27.678882 4563 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6486446b9f-zhm5r" podUID="cf409104-ddfd-4643-a35c-3c34c6ce2d14" Nov 24 09:17:27 crc kubenswrapper[4563]: E1124 09:17:27.680572 4563 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba" Nov 24 09:17:27 crc kubenswrapper[4563]: E1124 09:17:27.680729 4563 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gslwt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7bdd77c89-xdbpz_openstack(0d574c25-9f39-4836-952d-f75bbfd7ae95): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 24 09:17:27 crc kubenswrapper[4563]: E1124 09:17:27.682130 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-7bdd77c89-xdbpz" podUID="0d574c25-9f39-4836-952d-f75bbfd7ae95" Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.076150 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzvm8" event={"ID":"757c88fd-7bed-4810-a374-f1246f058983","Type":"ContainerDied","Data":"9a845ec814ba7722458dc6ee492d6653eac2ff8f928330a0639e2343b5c93cc2"} Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.076873 4563 scope.go:117] "RemoveContainer" containerID="f60157b6cab8eef58fe8a370219450715e305cbd7672d8ea52fd3b703fafc7d9" Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.076438 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qzvm8" Nov 24 09:17:28 crc kubenswrapper[4563]: E1124 09:17:28.077694 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba\\\"\"" pod="openstack/dnsmasq-dns-6486446b9f-zhm5r" podUID="cf409104-ddfd-4643-a35c-3c34c6ce2d14" Nov 24 09:17:28 crc kubenswrapper[4563]: E1124 09:17:28.079785 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba\\\"\"" pod="openstack/dnsmasq-dns-7c6d9948dc-d7bd7" podUID="2a278b53-bccd-4b13-aeaf-14674dacdb41" Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.161444 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qzvm8"] Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.167810 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-qzvm8"] Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.687044 4563 scope.go:117] "RemoveContainer" containerID="969b141569ae1ef134845d5f83f9ec41eb9f9d47c516fa38583831c3b76c5c91" Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.815230 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-246hk" Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.820421 4563 scope.go:117] "RemoveContainer" containerID="c40849eb4064ac80f1ff3d2435bda157518d20eb9b3bd89c5d1f4b61160d3bbf" Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.872874 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6584b49599-r4gs7" Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.888775 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbs8t\" (UniqueName: \"kubernetes.io/projected/a09025e4-08f3-453a-8356-3ef3eba4b04d-kube-api-access-cbs8t\") pod \"a09025e4-08f3-453a-8356-3ef3eba4b04d\" (UID: \"a09025e4-08f3-453a-8356-3ef3eba4b04d\") " Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.888840 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0fc0729-ebad-4f5e-9599-c04cc45fdfe7-config\") pod \"d0fc0729-ebad-4f5e-9599-c04cc45fdfe7\" (UID: \"d0fc0729-ebad-4f5e-9599-c04cc45fdfe7\") " Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.888883 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a09025e4-08f3-453a-8356-3ef3eba4b04d-catalog-content\") pod \"a09025e4-08f3-453a-8356-3ef3eba4b04d\" (UID: \"a09025e4-08f3-453a-8356-3ef3eba4b04d\") " Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.888919 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a09025e4-08f3-453a-8356-3ef3eba4b04d-utilities\") pod \"a09025e4-08f3-453a-8356-3ef3eba4b04d\" (UID: \"a09025e4-08f3-453a-8356-3ef3eba4b04d\") " Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.888964 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0fc0729-ebad-4f5e-9599-c04cc45fdfe7-dns-svc\") pod \"d0fc0729-ebad-4f5e-9599-c04cc45fdfe7\" (UID: \"d0fc0729-ebad-4f5e-9599-c04cc45fdfe7\") " Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.889009 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvbr6\" (UniqueName: \"kubernetes.io/projected/d0fc0729-ebad-4f5e-9599-c04cc45fdfe7-kube-api-access-hvbr6\") pod \"d0fc0729-ebad-4f5e-9599-c04cc45fdfe7\" (UID: \"d0fc0729-ebad-4f5e-9599-c04cc45fdfe7\") " Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.890279 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0fc0729-ebad-4f5e-9599-c04cc45fdfe7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d0fc0729-ebad-4f5e-9599-c04cc45fdfe7" (UID: "d0fc0729-ebad-4f5e-9599-c04cc45fdfe7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.890504 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a09025e4-08f3-453a-8356-3ef3eba4b04d-utilities" (OuterVolumeSpecName: "utilities") pod "a09025e4-08f3-453a-8356-3ef3eba4b04d" (UID: "a09025e4-08f3-453a-8356-3ef3eba4b04d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.890909 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0fc0729-ebad-4f5e-9599-c04cc45fdfe7-config" (OuterVolumeSpecName: "config") pod "d0fc0729-ebad-4f5e-9599-c04cc45fdfe7" (UID: "d0fc0729-ebad-4f5e-9599-c04cc45fdfe7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.893517 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0fc0729-ebad-4f5e-9599-c04cc45fdfe7-kube-api-access-hvbr6" (OuterVolumeSpecName: "kube-api-access-hvbr6") pod "d0fc0729-ebad-4f5e-9599-c04cc45fdfe7" (UID: "d0fc0729-ebad-4f5e-9599-c04cc45fdfe7"). InnerVolumeSpecName "kube-api-access-hvbr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.895466 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a09025e4-08f3-453a-8356-3ef3eba4b04d-kube-api-access-cbs8t" (OuterVolumeSpecName: "kube-api-access-cbs8t") pod "a09025e4-08f3-453a-8356-3ef3eba4b04d" (UID: "a09025e4-08f3-453a-8356-3ef3eba4b04d"). InnerVolumeSpecName "kube-api-access-cbs8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.897207 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bdd77c89-xdbpz" Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.929606 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a09025e4-08f3-453a-8356-3ef3eba4b04d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a09025e4-08f3-453a-8356-3ef3eba4b04d" (UID: "a09025e4-08f3-453a-8356-3ef3eba4b04d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.990478 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gslwt\" (UniqueName: \"kubernetes.io/projected/0d574c25-9f39-4836-952d-f75bbfd7ae95-kube-api-access-gslwt\") pod \"0d574c25-9f39-4836-952d-f75bbfd7ae95\" (UID: \"0d574c25-9f39-4836-952d-f75bbfd7ae95\") " Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.990770 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d574c25-9f39-4836-952d-f75bbfd7ae95-config\") pod \"0d574c25-9f39-4836-952d-f75bbfd7ae95\" (UID: \"0d574c25-9f39-4836-952d-f75bbfd7ae95\") " Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.991155 4563 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0fc0729-ebad-4f5e-9599-c04cc45fdfe7-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.991168 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvbr6\" (UniqueName: \"kubernetes.io/projected/d0fc0729-ebad-4f5e-9599-c04cc45fdfe7-kube-api-access-hvbr6\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.991194 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbs8t\" (UniqueName: \"kubernetes.io/projected/a09025e4-08f3-453a-8356-3ef3eba4b04d-kube-api-access-cbs8t\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.991253 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0fc0729-ebad-4f5e-9599-c04cc45fdfe7-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.991263 4563 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a09025e4-08f3-453a-8356-3ef3eba4b04d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.991272 4563 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a09025e4-08f3-453a-8356-3ef3eba4b04d-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.992230 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d574c25-9f39-4836-952d-f75bbfd7ae95-config" (OuterVolumeSpecName: "config") pod "0d574c25-9f39-4836-952d-f75bbfd7ae95" (UID: "0d574c25-9f39-4836-952d-f75bbfd7ae95"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:17:28 crc kubenswrapper[4563]: I1124 09:17:28.994940 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d574c25-9f39-4836-952d-f75bbfd7ae95-kube-api-access-gslwt" (OuterVolumeSpecName: "kube-api-access-gslwt") pod "0d574c25-9f39-4836-952d-f75bbfd7ae95" (UID: "0d574c25-9f39-4836-952d-f75bbfd7ae95"). InnerVolumeSpecName "kube-api-access-gslwt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:17:29 crc kubenswrapper[4563]: I1124 09:17:29.008883 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 24 09:17:29 crc kubenswrapper[4563]: W1124 09:17:29.009960 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c2b6368_21fd_4c13_b008_5fe4be95dc8d.slice/crio-c7b7c1fdec2f163ed386092679ffa12720c9545e1e8106d690bea556cc4c0899 WatchSource:0}: Error finding container c7b7c1fdec2f163ed386092679ffa12720c9545e1e8106d690bea556cc4c0899: Status 404 returned error can't find the container with id c7b7c1fdec2f163ed386092679ffa12720c9545e1e8106d690bea556cc4c0899 Nov 24 09:17:29 crc kubenswrapper[4563]: I1124 09:17:29.068121 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="757c88fd-7bed-4810-a374-f1246f058983" path="/var/lib/kubelet/pods/757c88fd-7bed-4810-a374-f1246f058983/volumes" Nov 24 09:17:29 crc kubenswrapper[4563]: I1124 09:17:29.084480 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-246hk" event={"ID":"a09025e4-08f3-453a-8356-3ef3eba4b04d","Type":"ContainerDied","Data":"ab01318b6cd33a72224a419f8239b42dab561f227c4553087a32e474a8eb3b89"} Nov 24 09:17:29 crc kubenswrapper[4563]: I1124 09:17:29.084528 4563 scope.go:117] "RemoveContainer" containerID="a852efc0c1439e27e50602b5b6c9f248a056dbf0e600d7d04554e46e04beeed3" Nov 24 09:17:29 crc kubenswrapper[4563]: I1124 09:17:29.084689 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-246hk" Nov 24 09:17:29 crc kubenswrapper[4563]: I1124 09:17:29.086694 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6584b49599-r4gs7" event={"ID":"d0fc0729-ebad-4f5e-9599-c04cc45fdfe7","Type":"ContainerDied","Data":"b6bbcec5834f24e0d86dadff5fe98f560cd98e603038c0624eee9ee3edccc8b7"} Nov 24 09:17:29 crc kubenswrapper[4563]: I1124 09:17:29.086714 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6584b49599-r4gs7" Nov 24 09:17:29 crc kubenswrapper[4563]: I1124 09:17:29.092365 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gslwt\" (UniqueName: \"kubernetes.io/projected/0d574c25-9f39-4836-952d-f75bbfd7ae95-kube-api-access-gslwt\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:29 crc kubenswrapper[4563]: I1124 09:17:29.092394 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d574c25-9f39-4836-952d-f75bbfd7ae95-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:29 crc kubenswrapper[4563]: I1124 09:17:29.093581 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bdd77c89-xdbpz" event={"ID":"0d574c25-9f39-4836-952d-f75bbfd7ae95","Type":"ContainerDied","Data":"6c768c894aaee7d6c48407b463604901fe789f1cf321f5a3b5bfa89369de01eb"} Nov 24 09:17:29 crc kubenswrapper[4563]: I1124 09:17:29.093674 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bdd77c89-xdbpz" Nov 24 09:17:29 crc kubenswrapper[4563]: I1124 09:17:29.096754 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2c2b6368-21fd-4c13-b008-5fe4be95dc8d","Type":"ContainerStarted","Data":"c7b7c1fdec2f163ed386092679ffa12720c9545e1e8106d690bea556cc4c0899"} Nov 24 09:17:29 crc kubenswrapper[4563]: I1124 09:17:29.106367 4563 scope.go:117] "RemoveContainer" containerID="01a75bb2102d0fd5f39344426da692a6fff5a6f24cd136e5ac7fbd50c90b8e55" Nov 24 09:17:29 crc kubenswrapper[4563]: I1124 09:17:29.136094 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bdd77c89-xdbpz"] Nov 24 09:17:29 crc kubenswrapper[4563]: I1124 09:17:29.143982 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bdd77c89-xdbpz"] Nov 24 09:17:29 crc kubenswrapper[4563]: I1124 09:17:29.153046 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-246hk"] Nov 24 09:17:29 crc kubenswrapper[4563]: I1124 09:17:29.160957 4563 scope.go:117] "RemoveContainer" containerID="a75a87c5e3b2ebaafe2571c9294613809649a776345d09f67956e8a0c54018ad" Nov 24 09:17:29 crc kubenswrapper[4563]: I1124 09:17:29.164092 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-246hk"] Nov 24 09:17:29 crc kubenswrapper[4563]: I1124 09:17:29.173039 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6584b49599-r4gs7"] Nov 24 09:17:29 crc kubenswrapper[4563]: I1124 09:17:29.177860 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6584b49599-r4gs7"] Nov 24 09:17:29 crc kubenswrapper[4563]: I1124 09:17:29.186330 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 24 09:17:29 crc kubenswrapper[4563]: I1124 09:17:29.223291 4563 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/memcached-0"] Nov 24 09:17:29 crc kubenswrapper[4563]: I1124 09:17:29.263784 4563 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 09:17:29 crc kubenswrapper[4563]: W1124 09:17:29.267721 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdeec6b1_05d8_4275_839f_a02e22e26f61.slice/crio-8eaaa92da9bfb74182b248aaf7e3e11952a2549eb4903efaead580bf944ae1f5 WatchSource:0}: Error finding container 8eaaa92da9bfb74182b248aaf7e3e11952a2549eb4903efaead580bf944ae1f5: Status 404 returned error can't find the container with id 8eaaa92da9bfb74182b248aaf7e3e11952a2549eb4903efaead580bf944ae1f5 Nov 24 09:17:29 crc kubenswrapper[4563]: I1124 09:17:29.288332 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 24 09:17:29 crc kubenswrapper[4563]: W1124 09:17:29.290138 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3399d213_46c4_42c1_9d69_26246c4ed771.slice/crio-04a6529fb258c099991bed2405554a357605592fe3676b66647c87dcbe19d2be WatchSource:0}: Error finding container 04a6529fb258c099991bed2405554a357605592fe3676b66647c87dcbe19d2be: Status 404 returned error can't find the container with id 04a6529fb258c099991bed2405554a357605592fe3676b66647c87dcbe19d2be Nov 24 09:17:29 crc kubenswrapper[4563]: I1124 09:17:29.329913 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qtfnl"] Nov 24 09:17:29 crc kubenswrapper[4563]: I1124 09:17:29.335188 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 24 09:17:29 crc kubenswrapper[4563]: W1124 09:17:29.343684 4563 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod241e854a_eb29_4933_98be_bad6b9295260.slice/crio-3e8c44b73e512d69791babaa0975f29d59d79278c9fc063806fd2914b1dfbd22 WatchSource:0}: Error finding container 3e8c44b73e512d69791babaa0975f29d59d79278c9fc063806fd2914b1dfbd22: Status 404 returned error can't find the container with id 3e8c44b73e512d69791babaa0975f29d59d79278c9fc063806fd2914b1dfbd22 Nov 24 09:17:29 crc kubenswrapper[4563]: W1124 09:17:29.345087 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d28277e_c9e2_4e14_bcda_b8e7684ce6f2.slice/crio-556f1a86d4d9bd6d35e66635f8b463fcd483675a1fb633a05684859f95200bbd WatchSource:0}: Error finding container 556f1a86d4d9bd6d35e66635f8b463fcd483675a1fb633a05684859f95200bbd: Status 404 returned error can't find the container with id 556f1a86d4d9bd6d35e66635f8b463fcd483675a1fb633a05684859f95200bbd Nov 24 09:17:29 crc kubenswrapper[4563]: I1124 09:17:29.407394 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 24 09:17:29 crc kubenswrapper[4563]: W1124 09:17:29.462681 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04022879_4b41_4e57_ae94_a3517d382e7d.slice/crio-3596b0e35d800311d0fab21e850a664942c9e2a832f0f57671cbc2caf7f01a08 WatchSource:0}: Error finding container 3596b0e35d800311d0fab21e850a664942c9e2a832f0f57671cbc2caf7f01a08: Status 404 returned error can't find the container with id 3596b0e35d800311d0fab21e850a664942c9e2a832f0f57671cbc2caf7f01a08 Nov 24 09:17:30 crc kubenswrapper[4563]: I1124 09:17:30.045845 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-6z24f"] Nov 24 09:17:30 crc kubenswrapper[4563]: I1124 09:17:30.106382 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"8d28277e-c9e2-4e14-bcda-b8e7684ce6f2","Type":"ContainerStarted","Data":"556f1a86d4d9bd6d35e66635f8b463fcd483675a1fb633a05684859f95200bbd"} Nov 24 09:17:30 crc kubenswrapper[4563]: I1124 09:17:30.108960 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b0de325e-9aea-4ee2-9cc4-093f3d8d3f65","Type":"ContainerStarted","Data":"4c5999a01aea56bfb5c36fe4f770f2422cda552b0aad887328733f2382bb34d0"} Nov 24 09:17:30 crc kubenswrapper[4563]: I1124 09:17:30.110663 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"04022879-4b41-4e57-ae94-a3517d382e7d","Type":"ContainerStarted","Data":"3596b0e35d800311d0fab21e850a664942c9e2a832f0f57671cbc2caf7f01a08"} Nov 24 09:17:30 crc kubenswrapper[4563]: I1124 09:17:30.111996 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qtfnl" event={"ID":"241e854a-eb29-4933-98be-bad6b9295260","Type":"ContainerStarted","Data":"3e8c44b73e512d69791babaa0975f29d59d79278c9fc063806fd2914b1dfbd22"} Nov 24 09:17:30 crc kubenswrapper[4563]: I1124 09:17:30.113626 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"bdeec6b1-05d8-4275-839f-a02e22e26f61","Type":"ContainerStarted","Data":"8eaaa92da9bfb74182b248aaf7e3e11952a2549eb4903efaead580bf944ae1f5"} Nov 24 09:17:30 crc kubenswrapper[4563]: I1124 09:17:30.115442 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e4286a17-bf24-4c91-91cb-6e3f3d731d24","Type":"ContainerStarted","Data":"73d417eaecd31e6125b6cbd867865adffe9c7cd078cefc1bbdf0d4e6e9eec4e1"} Nov 24 09:17:30 crc kubenswrapper[4563]: I1124 09:17:30.125265 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3399d213-46c4-42c1-9d69-26246c4ed771","Type":"ContainerStarted","Data":"04a6529fb258c099991bed2405554a357605592fe3676b66647c87dcbe19d2be"} Nov 24 09:17:30 crc 
kubenswrapper[4563]: I1124 09:17:30.127299 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"18ec698b-354c-4d4e-9126-16c493474617","Type":"ContainerStarted","Data":"92de487af76ff9936312b95bbfc2bef78c4d67ca7362338d5d1ace860caa89c0"} Nov 24 09:17:30 crc kubenswrapper[4563]: W1124 09:17:30.172512 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01d7f46a_ff30_4904_a63a_8d41cea54dd7.slice/crio-87a1ef934200cfddec43526a38a08c0eecd52dfdb06fcfa8727ac4c75b36b722 WatchSource:0}: Error finding container 87a1ef934200cfddec43526a38a08c0eecd52dfdb06fcfa8727ac4c75b36b722: Status 404 returned error can't find the container with id 87a1ef934200cfddec43526a38a08c0eecd52dfdb06fcfa8727ac4c75b36b722 Nov 24 09:17:31 crc kubenswrapper[4563]: I1124 09:17:31.071652 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d574c25-9f39-4836-952d-f75bbfd7ae95" path="/var/lib/kubelet/pods/0d574c25-9f39-4836-952d-f75bbfd7ae95/volumes" Nov 24 09:17:31 crc kubenswrapper[4563]: I1124 09:17:31.072540 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a09025e4-08f3-453a-8356-3ef3eba4b04d" path="/var/lib/kubelet/pods/a09025e4-08f3-453a-8356-3ef3eba4b04d/volumes" Nov 24 09:17:31 crc kubenswrapper[4563]: I1124 09:17:31.073731 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0fc0729-ebad-4f5e-9599-c04cc45fdfe7" path="/var/lib/kubelet/pods/d0fc0729-ebad-4f5e-9599-c04cc45fdfe7/volumes" Nov 24 09:17:31 crc kubenswrapper[4563]: I1124 09:17:31.136276 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-6z24f" event={"ID":"01d7f46a-ff30-4904-a63a-8d41cea54dd7","Type":"ContainerStarted","Data":"87a1ef934200cfddec43526a38a08c0eecd52dfdb06fcfa8727ac4c75b36b722"} Nov 24 09:17:35 crc kubenswrapper[4563]: I1124 09:17:35.194128 4563 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/kube-state-metrics-0" event={"ID":"8d28277e-c9e2-4e14-bcda-b8e7684ce6f2","Type":"ContainerStarted","Data":"ee4389b6221b7cf681e00888ee60280859a24a1c697b600bfd3959daf6d7fb54"} Nov 24 09:17:35 crc kubenswrapper[4563]: I1124 09:17:35.194753 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 24 09:17:35 crc kubenswrapper[4563]: I1124 09:17:35.196271 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2c2b6368-21fd-4c13-b008-5fe4be95dc8d","Type":"ContainerStarted","Data":"53b21ab1596405f1e7046c78f9f490af711375ac313c43583abf03ea35a908fb"} Nov 24 09:17:35 crc kubenswrapper[4563]: I1124 09:17:35.198039 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b0de325e-9aea-4ee2-9cc4-093f3d8d3f65","Type":"ContainerStarted","Data":"3af3af39224a2878005274e87a0b730dc104a0ed6dfa0ce4b4f1ecc9cc231ebf"} Nov 24 09:17:35 crc kubenswrapper[4563]: I1124 09:17:35.199591 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"04022879-4b41-4e57-ae94-a3517d382e7d","Type":"ContainerStarted","Data":"9fe674fb9c9c101faeb7b96b91dadbe5aa2260127bb16462b153f7f11e66f2b5"} Nov 24 09:17:35 crc kubenswrapper[4563]: I1124 09:17:35.200930 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qtfnl" event={"ID":"241e854a-eb29-4933-98be-bad6b9295260","Type":"ContainerStarted","Data":"045320426b93fce4f5b2e515f32420b5cdcdc41653cb1216ac8612d92e2a772d"} Nov 24 09:17:35 crc kubenswrapper[4563]: I1124 09:17:35.201017 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-qtfnl" Nov 24 09:17:35 crc kubenswrapper[4563]: I1124 09:17:35.202356 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"bdeec6b1-05d8-4275-839f-a02e22e26f61","Type":"ContainerStarted","Data":"802cc09a6ae459b89685338b5560ba6dd8b49d9001868f4a3222086d422554f6"} Nov 24 09:17:35 crc kubenswrapper[4563]: I1124 09:17:35.202468 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Nov 24 09:17:35 crc kubenswrapper[4563]: I1124 09:17:35.204132 4563 generic.go:334] "Generic (PLEG): container finished" podID="01d7f46a-ff30-4904-a63a-8d41cea54dd7" containerID="165b6dbc4ea8a0e047fa0bcd87f44e7547eb186d40add0a676942c895db3d45b" exitCode=0 Nov 24 09:17:35 crc kubenswrapper[4563]: I1124 09:17:35.204182 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-6z24f" event={"ID":"01d7f46a-ff30-4904-a63a-8d41cea54dd7","Type":"ContainerDied","Data":"165b6dbc4ea8a0e047fa0bcd87f44e7547eb186d40add0a676942c895db3d45b"} Nov 24 09:17:35 crc kubenswrapper[4563]: I1124 09:17:35.205614 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3399d213-46c4-42c1-9d69-26246c4ed771","Type":"ContainerStarted","Data":"19c08bca7ff976c4d8350c0cefcc1ddf26716cca4a9b4397d4653e5456bea320"} Nov 24 09:17:35 crc kubenswrapper[4563]: I1124 09:17:35.227128 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=13.021626581 podStartE2EDuration="18.227116591s" podCreationTimestamp="2025-11-24 09:17:17 +0000 UTC" firstStartedPulling="2025-11-24 09:17:29.348416338 +0000 UTC m=+826.607393786" lastFinishedPulling="2025-11-24 09:17:34.55390635 +0000 UTC m=+831.812883796" observedRunningTime="2025-11-24 09:17:35.218662724 +0000 UTC m=+832.477640171" watchObservedRunningTime="2025-11-24 09:17:35.227116591 +0000 UTC m=+832.486094038" Nov 24 09:17:35 crc kubenswrapper[4563]: I1124 09:17:35.283063 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=15.007680071 
podStartE2EDuration="20.283043002s" podCreationTimestamp="2025-11-24 09:17:15 +0000 UTC" firstStartedPulling="2025-11-24 09:17:29.27605193 +0000 UTC m=+826.535029376" lastFinishedPulling="2025-11-24 09:17:34.55141486 +0000 UTC m=+831.810392307" observedRunningTime="2025-11-24 09:17:35.28114354 +0000 UTC m=+832.540120987" watchObservedRunningTime="2025-11-24 09:17:35.283043002 +0000 UTC m=+832.542020450" Nov 24 09:17:35 crc kubenswrapper[4563]: I1124 09:17:35.296832 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-qtfnl" podStartSLOduration=9.073380501 podStartE2EDuration="14.296818758s" podCreationTimestamp="2025-11-24 09:17:21 +0000 UTC" firstStartedPulling="2025-11-24 09:17:29.345740702 +0000 UTC m=+826.604718148" lastFinishedPulling="2025-11-24 09:17:34.569178958 +0000 UTC m=+831.828156405" observedRunningTime="2025-11-24 09:17:35.293830741 +0000 UTC m=+832.552808189" watchObservedRunningTime="2025-11-24 09:17:35.296818758 +0000 UTC m=+832.555796204" Nov 24 09:17:36 crc kubenswrapper[4563]: I1124 09:17:36.218429 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-6z24f" event={"ID":"01d7f46a-ff30-4904-a63a-8d41cea54dd7","Type":"ContainerStarted","Data":"d44054bf4024984e75a9d52bb5faf73ace899b5dc10fc8f14fa218f17bb734e9"} Nov 24 09:17:36 crc kubenswrapper[4563]: I1124 09:17:36.218953 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-6z24f" event={"ID":"01d7f46a-ff30-4904-a63a-8d41cea54dd7","Type":"ContainerStarted","Data":"09830da540b13a4aab870750c76cc0826a6ff394258a94d767ba56c458bf623e"} Nov 24 09:17:36 crc kubenswrapper[4563]: I1124 09:17:36.243067 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-6z24f" podStartSLOduration=10.836328665 podStartE2EDuration="15.243035581s" podCreationTimestamp="2025-11-24 09:17:21 +0000 UTC" firstStartedPulling="2025-11-24 09:17:30.174621999 +0000 
UTC m=+827.433599447" lastFinishedPulling="2025-11-24 09:17:34.581328916 +0000 UTC m=+831.840306363" observedRunningTime="2025-11-24 09:17:36.233989777 +0000 UTC m=+833.492967246" watchObservedRunningTime="2025-11-24 09:17:36.243035581 +0000 UTC m=+833.502013029" Nov 24 09:17:37 crc kubenswrapper[4563]: I1124 09:17:37.151948 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-6z24f" Nov 24 09:17:37 crc kubenswrapper[4563]: I1124 09:17:37.152373 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-6z24f" Nov 24 09:17:38 crc kubenswrapper[4563]: I1124 09:17:38.239738 4563 generic.go:334] "Generic (PLEG): container finished" podID="2c2b6368-21fd-4c13-b008-5fe4be95dc8d" containerID="53b21ab1596405f1e7046c78f9f490af711375ac313c43583abf03ea35a908fb" exitCode=0 Nov 24 09:17:38 crc kubenswrapper[4563]: I1124 09:17:38.239763 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2c2b6368-21fd-4c13-b008-5fe4be95dc8d","Type":"ContainerDied","Data":"53b21ab1596405f1e7046c78f9f490af711375ac313c43583abf03ea35a908fb"} Nov 24 09:17:38 crc kubenswrapper[4563]: I1124 09:17:38.242076 4563 generic.go:334] "Generic (PLEG): container finished" podID="b0de325e-9aea-4ee2-9cc4-093f3d8d3f65" containerID="3af3af39224a2878005274e87a0b730dc104a0ed6dfa0ce4b4f1ecc9cc231ebf" exitCode=0 Nov 24 09:17:38 crc kubenswrapper[4563]: I1124 09:17:38.242099 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b0de325e-9aea-4ee2-9cc4-093f3d8d3f65","Type":"ContainerDied","Data":"3af3af39224a2878005274e87a0b730dc104a0ed6dfa0ce4b4f1ecc9cc231ebf"} Nov 24 09:17:38 crc kubenswrapper[4563]: I1124 09:17:38.243854 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"04022879-4b41-4e57-ae94-a3517d382e7d","Type":"ContainerStarted","Data":"3f8f977814dc1f3f3943a6b1004fc0e2a6db702ae3338a85e9a711959b403402"} Nov 24 09:17:38 crc kubenswrapper[4563]: I1124 09:17:38.246407 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3399d213-46c4-42c1-9d69-26246c4ed771","Type":"ContainerStarted","Data":"c4f96acc798df9883400cf943704efc8efc2ce1c9c28a74208dad7c67a3c9d47"} Nov 24 09:17:38 crc kubenswrapper[4563]: I1124 09:17:38.320185 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=6.562993811 podStartE2EDuration="14.320124386s" podCreationTimestamp="2025-11-24 09:17:24 +0000 UTC" firstStartedPulling="2025-11-24 09:17:29.465878903 +0000 UTC m=+826.724856351" lastFinishedPulling="2025-11-24 09:17:37.223009479 +0000 UTC m=+834.481986926" observedRunningTime="2025-11-24 09:17:38.286785279 +0000 UTC m=+835.545762726" watchObservedRunningTime="2025-11-24 09:17:38.320124386 +0000 UTC m=+835.579101833" Nov 24 09:17:38 crc kubenswrapper[4563]: I1124 09:17:38.328300 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=9.400998245 podStartE2EDuration="17.328272726s" podCreationTimestamp="2025-11-24 09:17:21 +0000 UTC" firstStartedPulling="2025-11-24 09:17:29.292146849 +0000 UTC m=+826.551124296" lastFinishedPulling="2025-11-24 09:17:37.219421331 +0000 UTC m=+834.478398777" observedRunningTime="2025-11-24 09:17:38.323125496 +0000 UTC m=+835.582102953" watchObservedRunningTime="2025-11-24 09:17:38.328272726 +0000 UTC m=+835.587250173" Nov 24 09:17:38 crc kubenswrapper[4563]: I1124 09:17:38.987182 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Nov 24 09:17:38 crc kubenswrapper[4563]: I1124 09:17:38.987561 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:17:38 crc kubenswrapper[4563]: I1124 09:17:38.987617 4563 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" Nov 24 09:17:38 crc kubenswrapper[4563]: I1124 09:17:38.988312 4563 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6964b83d094213dce29e8bc08bcdab313730109f615c70e3b48ab3147ba318f2"} pod="openshift-machine-config-operator/machine-config-daemon-stlxr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 09:17:38 crc kubenswrapper[4563]: I1124 09:17:38.988370 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" containerID="cri-o://6964b83d094213dce29e8bc08bcdab313730109f615c70e3b48ab3147ba318f2" gracePeriod=600 Nov 24 09:17:39 crc kubenswrapper[4563]: I1124 09:17:39.262877 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2c2b6368-21fd-4c13-b008-5fe4be95dc8d","Type":"ContainerStarted","Data":"35f615cfed611191b947c693e69c585c5ff17e4ff4bfc0aa792ca75490ee7372"} Nov 24 09:17:39 crc kubenswrapper[4563]: I1124 09:17:39.266055 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"b0de325e-9aea-4ee2-9cc4-093f3d8d3f65","Type":"ContainerStarted","Data":"7234d2c079e428098fcfe2bb5e88250f00a5047c197053e0ecfe0aab5b7c3af9"} Nov 24 09:17:39 crc kubenswrapper[4563]: I1124 09:17:39.268475 4563 generic.go:334] "Generic (PLEG): container finished" podID="3b2bfe55-8989-49b3-bb61-e28189447627" containerID="6964b83d094213dce29e8bc08bcdab313730109f615c70e3b48ab3147ba318f2" exitCode=0 Nov 24 09:17:39 crc kubenswrapper[4563]: I1124 09:17:39.268536 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" event={"ID":"3b2bfe55-8989-49b3-bb61-e28189447627","Type":"ContainerDied","Data":"6964b83d094213dce29e8bc08bcdab313730109f615c70e3b48ab3147ba318f2"} Nov 24 09:17:39 crc kubenswrapper[4563]: I1124 09:17:39.268577 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" event={"ID":"3b2bfe55-8989-49b3-bb61-e28189447627","Type":"ContainerStarted","Data":"03f875f88eef557bff28f5ed0a9f361fdac2df81584f5d9cbef7e181ab4ba280"} Nov 24 09:17:39 crc kubenswrapper[4563]: I1124 09:17:39.268609 4563 scope.go:117] "RemoveContainer" containerID="70f9c1996df47923e8f844e33d7bfcc71ba8679e22cd73a43ba1a4424e2bc93b" Nov 24 09:17:39 crc kubenswrapper[4563]: I1124 09:17:39.286108 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=19.744860277 podStartE2EDuration="25.286087035s" podCreationTimestamp="2025-11-24 09:17:14 +0000 UTC" firstStartedPulling="2025-11-24 09:17:29.012522685 +0000 UTC m=+826.271500132" lastFinishedPulling="2025-11-24 09:17:34.553749443 +0000 UTC m=+831.812726890" observedRunningTime="2025-11-24 09:17:39.284935663 +0000 UTC m=+836.543913110" watchObservedRunningTime="2025-11-24 09:17:39.286087035 +0000 UTC m=+836.545064482" Nov 24 09:17:39 crc kubenswrapper[4563]: I1124 09:17:39.309962 4563 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=21.004290761 podStartE2EDuration="26.309947332s" podCreationTimestamp="2025-11-24 09:17:13 +0000 UTC" firstStartedPulling="2025-11-24 09:17:29.263528207 +0000 UTC m=+826.522505644" lastFinishedPulling="2025-11-24 09:17:34.569184769 +0000 UTC m=+831.828162215" observedRunningTime="2025-11-24 09:17:39.303035946 +0000 UTC m=+836.562013393" watchObservedRunningTime="2025-11-24 09:17:39.309947332 +0000 UTC m=+836.568924769" Nov 24 09:17:40 crc kubenswrapper[4563]: I1124 09:17:40.742971 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Nov 24 09:17:40 crc kubenswrapper[4563]: I1124 09:17:40.744101 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Nov 24 09:17:40 crc kubenswrapper[4563]: I1124 09:17:40.777025 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.065604 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.092581 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.230819 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.288541 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.323458 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.328024 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ovsdbserver-nb-0" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.555762 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6486446b9f-zhm5r"] Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.578042 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65c9b8d4f7-7gcks"] Nov 24 09:17:41 crc kubenswrapper[4563]: E1124 09:17:41.578447 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a09025e4-08f3-453a-8356-3ef3eba4b04d" containerName="extract-utilities" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.578471 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="a09025e4-08f3-453a-8356-3ef3eba4b04d" containerName="extract-utilities" Nov 24 09:17:41 crc kubenswrapper[4563]: E1124 09:17:41.578491 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a09025e4-08f3-453a-8356-3ef3eba4b04d" containerName="registry-server" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.578499 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="a09025e4-08f3-453a-8356-3ef3eba4b04d" containerName="registry-server" Nov 24 09:17:41 crc kubenswrapper[4563]: E1124 09:17:41.578511 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="757c88fd-7bed-4810-a374-f1246f058983" containerName="extract-content" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.578516 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="757c88fd-7bed-4810-a374-f1246f058983" containerName="extract-content" Nov 24 09:17:41 crc kubenswrapper[4563]: E1124 09:17:41.578546 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="757c88fd-7bed-4810-a374-f1246f058983" containerName="extract-utilities" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.578572 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="757c88fd-7bed-4810-a374-f1246f058983" containerName="extract-utilities" Nov 24 09:17:41 crc kubenswrapper[4563]: 
E1124 09:17:41.578585 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a09025e4-08f3-453a-8356-3ef3eba4b04d" containerName="extract-content" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.578591 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="a09025e4-08f3-453a-8356-3ef3eba4b04d" containerName="extract-content" Nov 24 09:17:41 crc kubenswrapper[4563]: E1124 09:17:41.578682 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="757c88fd-7bed-4810-a374-f1246f058983" containerName="registry-server" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.578693 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="757c88fd-7bed-4810-a374-f1246f058983" containerName="registry-server" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.578885 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="a09025e4-08f3-453a-8356-3ef3eba4b04d" containerName="registry-server" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.578907 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="757c88fd-7bed-4810-a374-f1246f058983" containerName="registry-server" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.581476 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65c9b8d4f7-7gcks" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.584757 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.586690 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65c9b8d4f7-7gcks"] Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.600876 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-cblxg"] Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.602003 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-cblxg" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.604715 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.614095 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d5d57856-0858-4ef6-86b1-282d4bc462be-ovn-rundir\") pod \"ovn-controller-metrics-cblxg\" (UID: \"d5d57856-0858-4ef6-86b1-282d4bc462be\") " pod="openstack/ovn-controller-metrics-cblxg" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.614135 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5d57856-0858-4ef6-86b1-282d4bc462be-config\") pod \"ovn-controller-metrics-cblxg\" (UID: \"d5d57856-0858-4ef6-86b1-282d4bc462be\") " pod="openstack/ovn-controller-metrics-cblxg" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.614178 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d5d57856-0858-4ef6-86b1-282d4bc462be-ovs-rundir\") pod \"ovn-controller-metrics-cblxg\" (UID: \"d5d57856-0858-4ef6-86b1-282d4bc462be\") " pod="openstack/ovn-controller-metrics-cblxg" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.614275 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5d57856-0858-4ef6-86b1-282d4bc462be-combined-ca-bundle\") pod \"ovn-controller-metrics-cblxg\" (UID: \"d5d57856-0858-4ef6-86b1-282d4bc462be\") " pod="openstack/ovn-controller-metrics-cblxg" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.614339 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5d57856-0858-4ef6-86b1-282d4bc462be-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-cblxg\" (UID: \"d5d57856-0858-4ef6-86b1-282d4bc462be\") " pod="openstack/ovn-controller-metrics-cblxg" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.614420 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af3bd664-81f7-40fa-a654-deaefa63e7e0-ovsdbserver-sb\") pod \"dnsmasq-dns-65c9b8d4f7-7gcks\" (UID: \"af3bd664-81f7-40fa-a654-deaefa63e7e0\") " pod="openstack/dnsmasq-dns-65c9b8d4f7-7gcks" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.614469 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8jqh\" (UniqueName: \"kubernetes.io/projected/d5d57856-0858-4ef6-86b1-282d4bc462be-kube-api-access-x8jqh\") pod \"ovn-controller-metrics-cblxg\" (UID: \"d5d57856-0858-4ef6-86b1-282d4bc462be\") " pod="openstack/ovn-controller-metrics-cblxg" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.614528 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af3bd664-81f7-40fa-a654-deaefa63e7e0-dns-svc\") pod \"dnsmasq-dns-65c9b8d4f7-7gcks\" (UID: \"af3bd664-81f7-40fa-a654-deaefa63e7e0\") " pod="openstack/dnsmasq-dns-65c9b8d4f7-7gcks" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.614556 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gjkg\" (UniqueName: \"kubernetes.io/projected/af3bd664-81f7-40fa-a654-deaefa63e7e0-kube-api-access-7gjkg\") pod \"dnsmasq-dns-65c9b8d4f7-7gcks\" (UID: \"af3bd664-81f7-40fa-a654-deaefa63e7e0\") " pod="openstack/dnsmasq-dns-65c9b8d4f7-7gcks" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.614615 4563 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3bd664-81f7-40fa-a654-deaefa63e7e0-config\") pod \"dnsmasq-dns-65c9b8d4f7-7gcks\" (UID: \"af3bd664-81f7-40fa-a654-deaefa63e7e0\") " pod="openstack/dnsmasq-dns-65c9b8d4f7-7gcks" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.656859 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-cblxg"] Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.715835 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8jqh\" (UniqueName: \"kubernetes.io/projected/d5d57856-0858-4ef6-86b1-282d4bc462be-kube-api-access-x8jqh\") pod \"ovn-controller-metrics-cblxg\" (UID: \"d5d57856-0858-4ef6-86b1-282d4bc462be\") " pod="openstack/ovn-controller-metrics-cblxg" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.715949 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af3bd664-81f7-40fa-a654-deaefa63e7e0-dns-svc\") pod \"dnsmasq-dns-65c9b8d4f7-7gcks\" (UID: \"af3bd664-81f7-40fa-a654-deaefa63e7e0\") " pod="openstack/dnsmasq-dns-65c9b8d4f7-7gcks" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.716027 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gjkg\" (UniqueName: \"kubernetes.io/projected/af3bd664-81f7-40fa-a654-deaefa63e7e0-kube-api-access-7gjkg\") pod \"dnsmasq-dns-65c9b8d4f7-7gcks\" (UID: \"af3bd664-81f7-40fa-a654-deaefa63e7e0\") " pod="openstack/dnsmasq-dns-65c9b8d4f7-7gcks" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.716126 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3bd664-81f7-40fa-a654-deaefa63e7e0-config\") pod \"dnsmasq-dns-65c9b8d4f7-7gcks\" (UID: \"af3bd664-81f7-40fa-a654-deaefa63e7e0\") " 
pod="openstack/dnsmasq-dns-65c9b8d4f7-7gcks" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.716209 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d5d57856-0858-4ef6-86b1-282d4bc462be-ovn-rundir\") pod \"ovn-controller-metrics-cblxg\" (UID: \"d5d57856-0858-4ef6-86b1-282d4bc462be\") " pod="openstack/ovn-controller-metrics-cblxg" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.716282 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5d57856-0858-4ef6-86b1-282d4bc462be-config\") pod \"ovn-controller-metrics-cblxg\" (UID: \"d5d57856-0858-4ef6-86b1-282d4bc462be\") " pod="openstack/ovn-controller-metrics-cblxg" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.716354 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d5d57856-0858-4ef6-86b1-282d4bc462be-ovs-rundir\") pod \"ovn-controller-metrics-cblxg\" (UID: \"d5d57856-0858-4ef6-86b1-282d4bc462be\") " pod="openstack/ovn-controller-metrics-cblxg" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.716437 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5d57856-0858-4ef6-86b1-282d4bc462be-combined-ca-bundle\") pod \"ovn-controller-metrics-cblxg\" (UID: \"d5d57856-0858-4ef6-86b1-282d4bc462be\") " pod="openstack/ovn-controller-metrics-cblxg" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.716525 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5d57856-0858-4ef6-86b1-282d4bc462be-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-cblxg\" (UID: \"d5d57856-0858-4ef6-86b1-282d4bc462be\") " pod="openstack/ovn-controller-metrics-cblxg" Nov 24 09:17:41 crc 
kubenswrapper[4563]: I1124 09:17:41.716622 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af3bd664-81f7-40fa-a654-deaefa63e7e0-ovsdbserver-sb\") pod \"dnsmasq-dns-65c9b8d4f7-7gcks\" (UID: \"af3bd664-81f7-40fa-a654-deaefa63e7e0\") " pod="openstack/dnsmasq-dns-65c9b8d4f7-7gcks" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.717030 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af3bd664-81f7-40fa-a654-deaefa63e7e0-dns-svc\") pod \"dnsmasq-dns-65c9b8d4f7-7gcks\" (UID: \"af3bd664-81f7-40fa-a654-deaefa63e7e0\") " pod="openstack/dnsmasq-dns-65c9b8d4f7-7gcks" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.717549 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5d57856-0858-4ef6-86b1-282d4bc462be-config\") pod \"ovn-controller-metrics-cblxg\" (UID: \"d5d57856-0858-4ef6-86b1-282d4bc462be\") " pod="openstack/ovn-controller-metrics-cblxg" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.717742 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3bd664-81f7-40fa-a654-deaefa63e7e0-config\") pod \"dnsmasq-dns-65c9b8d4f7-7gcks\" (UID: \"af3bd664-81f7-40fa-a654-deaefa63e7e0\") " pod="openstack/dnsmasq-dns-65c9b8d4f7-7gcks" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.717915 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d5d57856-0858-4ef6-86b1-282d4bc462be-ovs-rundir\") pod \"ovn-controller-metrics-cblxg\" (UID: \"d5d57856-0858-4ef6-86b1-282d4bc462be\") " pod="openstack/ovn-controller-metrics-cblxg" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.718312 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" 
(UniqueName: \"kubernetes.io/host-path/d5d57856-0858-4ef6-86b1-282d4bc462be-ovn-rundir\") pod \"ovn-controller-metrics-cblxg\" (UID: \"d5d57856-0858-4ef6-86b1-282d4bc462be\") " pod="openstack/ovn-controller-metrics-cblxg" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.721189 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af3bd664-81f7-40fa-a654-deaefa63e7e0-ovsdbserver-sb\") pod \"dnsmasq-dns-65c9b8d4f7-7gcks\" (UID: \"af3bd664-81f7-40fa-a654-deaefa63e7e0\") " pod="openstack/dnsmasq-dns-65c9b8d4f7-7gcks" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.721391 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5d57856-0858-4ef6-86b1-282d4bc462be-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-cblxg\" (UID: \"d5d57856-0858-4ef6-86b1-282d4bc462be\") " pod="openstack/ovn-controller-metrics-cblxg" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.723119 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5d57856-0858-4ef6-86b1-282d4bc462be-combined-ca-bundle\") pod \"ovn-controller-metrics-cblxg\" (UID: \"d5d57856-0858-4ef6-86b1-282d4bc462be\") " pod="openstack/ovn-controller-metrics-cblxg" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.731551 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8jqh\" (UniqueName: \"kubernetes.io/projected/d5d57856-0858-4ef6-86b1-282d4bc462be-kube-api-access-x8jqh\") pod \"ovn-controller-metrics-cblxg\" (UID: \"d5d57856-0858-4ef6-86b1-282d4bc462be\") " pod="openstack/ovn-controller-metrics-cblxg" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.732475 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gjkg\" (UniqueName: 
\"kubernetes.io/projected/af3bd664-81f7-40fa-a654-deaefa63e7e0-kube-api-access-7gjkg\") pod \"dnsmasq-dns-65c9b8d4f7-7gcks\" (UID: \"af3bd664-81f7-40fa-a654-deaefa63e7e0\") " pod="openstack/dnsmasq-dns-65c9b8d4f7-7gcks" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.879657 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c6d9948dc-d7bd7"] Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.898319 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c476d78c5-lxpcq"] Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.898781 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65c9b8d4f7-7gcks" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.899490 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c476d78c5-lxpcq" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.901934 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.918092 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-cblxg" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.918140 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c476d78c5-lxpcq"] Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.924397 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-ovsdbserver-sb\") pod \"dnsmasq-dns-5c476d78c5-lxpcq\" (UID: \"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a\") " pod="openstack/dnsmasq-dns-5c476d78c5-lxpcq" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.924463 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-ovsdbserver-nb\") pod \"dnsmasq-dns-5c476d78c5-lxpcq\" (UID: \"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a\") " pod="openstack/dnsmasq-dns-5c476d78c5-lxpcq" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.924492 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-config\") pod \"dnsmasq-dns-5c476d78c5-lxpcq\" (UID: \"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a\") " pod="openstack/dnsmasq-dns-5c476d78c5-lxpcq" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.924510 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x578\" (UniqueName: \"kubernetes.io/projected/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-kube-api-access-9x578\") pod \"dnsmasq-dns-5c476d78c5-lxpcq\" (UID: \"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a\") " pod="openstack/dnsmasq-dns-5c476d78c5-lxpcq" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.924530 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-dns-svc\") pod \"dnsmasq-dns-5c476d78c5-lxpcq\" (UID: \"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a\") " pod="openstack/dnsmasq-dns-5c476d78c5-lxpcq" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.960571 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.961742 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.962942 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.963896 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.964058 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-bgdlf" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.964122 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Nov 24 09:17:41 crc kubenswrapper[4563]: I1124 09:17:41.977856 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.028119 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c78e2b4d-f2bf-435e-b163-c9415021f43c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c78e2b4d-f2bf-435e-b163-c9415021f43c\") " pod="openstack/ovn-northd-0" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.028342 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8ggd7\" (UniqueName: \"kubernetes.io/projected/c78e2b4d-f2bf-435e-b163-c9415021f43c-kube-api-access-8ggd7\") pod \"ovn-northd-0\" (UID: \"c78e2b4d-f2bf-435e-b163-c9415021f43c\") " pod="openstack/ovn-northd-0" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.028370 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c78e2b4d-f2bf-435e-b163-c9415021f43c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c78e2b4d-f2bf-435e-b163-c9415021f43c\") " pod="openstack/ovn-northd-0" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.028400 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c78e2b4d-f2bf-435e-b163-c9415021f43c-scripts\") pod \"ovn-northd-0\" (UID: \"c78e2b4d-f2bf-435e-b163-c9415021f43c\") " pod="openstack/ovn-northd-0" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.028421 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c78e2b4d-f2bf-435e-b163-c9415021f43c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c78e2b4d-f2bf-435e-b163-c9415021f43c\") " pod="openstack/ovn-northd-0" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.028444 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c78e2b4d-f2bf-435e-b163-c9415021f43c-config\") pod \"ovn-northd-0\" (UID: \"c78e2b4d-f2bf-435e-b163-c9415021f43c\") " pod="openstack/ovn-northd-0" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.028486 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-ovsdbserver-sb\") pod \"dnsmasq-dns-5c476d78c5-lxpcq\" 
(UID: \"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a\") " pod="openstack/dnsmasq-dns-5c476d78c5-lxpcq" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.028526 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-ovsdbserver-nb\") pod \"dnsmasq-dns-5c476d78c5-lxpcq\" (UID: \"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a\") " pod="openstack/dnsmasq-dns-5c476d78c5-lxpcq" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.028545 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-config\") pod \"dnsmasq-dns-5c476d78c5-lxpcq\" (UID: \"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a\") " pod="openstack/dnsmasq-dns-5c476d78c5-lxpcq" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.029602 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-ovsdbserver-sb\") pod \"dnsmasq-dns-5c476d78c5-lxpcq\" (UID: \"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a\") " pod="openstack/dnsmasq-dns-5c476d78c5-lxpcq" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.030099 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-ovsdbserver-nb\") pod \"dnsmasq-dns-5c476d78c5-lxpcq\" (UID: \"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a\") " pod="openstack/dnsmasq-dns-5c476d78c5-lxpcq" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.030571 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-config\") pod \"dnsmasq-dns-5c476d78c5-lxpcq\" (UID: \"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a\") " pod="openstack/dnsmasq-dns-5c476d78c5-lxpcq" Nov 24 09:17:42 
crc kubenswrapper[4563]: I1124 09:17:42.028560 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x578\" (UniqueName: \"kubernetes.io/projected/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-kube-api-access-9x578\") pod \"dnsmasq-dns-5c476d78c5-lxpcq\" (UID: \"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a\") " pod="openstack/dnsmasq-dns-5c476d78c5-lxpcq" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.030614 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-dns-svc\") pod \"dnsmasq-dns-5c476d78c5-lxpcq\" (UID: \"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a\") " pod="openstack/dnsmasq-dns-5c476d78c5-lxpcq" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.030655 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c78e2b4d-f2bf-435e-b163-c9415021f43c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c78e2b4d-f2bf-435e-b163-c9415021f43c\") " pod="openstack/ovn-northd-0" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.031147 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-dns-svc\") pod \"dnsmasq-dns-5c476d78c5-lxpcq\" (UID: \"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a\") " pod="openstack/dnsmasq-dns-5c476d78c5-lxpcq" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.065565 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x578\" (UniqueName: \"kubernetes.io/projected/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-kube-api-access-9x578\") pod \"dnsmasq-dns-5c476d78c5-lxpcq\" (UID: \"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a\") " pod="openstack/dnsmasq-dns-5c476d78c5-lxpcq" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.139385 4563 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c78e2b4d-f2bf-435e-b163-c9415021f43c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c78e2b4d-f2bf-435e-b163-c9415021f43c\") " pod="openstack/ovn-northd-0" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.139446 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c78e2b4d-f2bf-435e-b163-c9415021f43c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c78e2b4d-f2bf-435e-b163-c9415021f43c\") " pod="openstack/ovn-northd-0" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.139468 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ggd7\" (UniqueName: \"kubernetes.io/projected/c78e2b4d-f2bf-435e-b163-c9415021f43c-kube-api-access-8ggd7\") pod \"ovn-northd-0\" (UID: \"c78e2b4d-f2bf-435e-b163-c9415021f43c\") " pod="openstack/ovn-northd-0" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.139501 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c78e2b4d-f2bf-435e-b163-c9415021f43c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c78e2b4d-f2bf-435e-b163-c9415021f43c\") " pod="openstack/ovn-northd-0" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.139573 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c78e2b4d-f2bf-435e-b163-c9415021f43c-scripts\") pod \"ovn-northd-0\" (UID: \"c78e2b4d-f2bf-435e-b163-c9415021f43c\") " pod="openstack/ovn-northd-0" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.139609 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c78e2b4d-f2bf-435e-b163-c9415021f43c-ovn-northd-tls-certs\") pod 
\"ovn-northd-0\" (UID: \"c78e2b4d-f2bf-435e-b163-c9415021f43c\") " pod="openstack/ovn-northd-0" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.139659 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c78e2b4d-f2bf-435e-b163-c9415021f43c-config\") pod \"ovn-northd-0\" (UID: \"c78e2b4d-f2bf-435e-b163-c9415021f43c\") " pod="openstack/ovn-northd-0" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.140476 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c78e2b4d-f2bf-435e-b163-c9415021f43c-config\") pod \"ovn-northd-0\" (UID: \"c78e2b4d-f2bf-435e-b163-c9415021f43c\") " pod="openstack/ovn-northd-0" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.142019 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c78e2b4d-f2bf-435e-b163-c9415021f43c-scripts\") pod \"ovn-northd-0\" (UID: \"c78e2b4d-f2bf-435e-b163-c9415021f43c\") " pod="openstack/ovn-northd-0" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.144768 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c78e2b4d-f2bf-435e-b163-c9415021f43c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c78e2b4d-f2bf-435e-b163-c9415021f43c\") " pod="openstack/ovn-northd-0" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.148511 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c78e2b4d-f2bf-435e-b163-c9415021f43c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c78e2b4d-f2bf-435e-b163-c9415021f43c\") " pod="openstack/ovn-northd-0" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.149525 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c78e2b4d-f2bf-435e-b163-c9415021f43c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c78e2b4d-f2bf-435e-b163-c9415021f43c\") " pod="openstack/ovn-northd-0" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.156172 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c78e2b4d-f2bf-435e-b163-c9415021f43c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c78e2b4d-f2bf-435e-b163-c9415021f43c\") " pod="openstack/ovn-northd-0" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.172249 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ggd7\" (UniqueName: \"kubernetes.io/projected/c78e2b4d-f2bf-435e-b163-c9415021f43c-kube-api-access-8ggd7\") pod \"ovn-northd-0\" (UID: \"c78e2b4d-f2bf-435e-b163-c9415021f43c\") " pod="openstack/ovn-northd-0" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.264490 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-cblxg"] Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.294657 4563 generic.go:334] "Generic (PLEG): container finished" podID="2a278b53-bccd-4b13-aeaf-14674dacdb41" containerID="bf84338a17b058ad9bd8e0073c93e55b467f9f88ac8790bc4032679be98fcab8" exitCode=0 Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.294696 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c6d9948dc-d7bd7" event={"ID":"2a278b53-bccd-4b13-aeaf-14674dacdb41","Type":"ContainerDied","Data":"bf84338a17b058ad9bd8e0073c93e55b467f9f88ac8790bc4032679be98fcab8"} Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.296294 4563 generic.go:334] "Generic (PLEG): container finished" podID="cf409104-ddfd-4643-a35c-3c34c6ce2d14" containerID="90c6c8e7be02164e2bb4a19c3d54d125ec56e1f2366a2c15d0e7cbda482df4f2" exitCode=0 Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.296346 4563 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-6486446b9f-zhm5r" event={"ID":"cf409104-ddfd-4643-a35c-3c34c6ce2d14","Type":"ContainerDied","Data":"90c6c8e7be02164e2bb4a19c3d54d125ec56e1f2366a2c15d0e7cbda482df4f2"} Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.296796 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c476d78c5-lxpcq" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.298126 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-cblxg" event={"ID":"d5d57856-0858-4ef6-86b1-282d4bc462be","Type":"ContainerStarted","Data":"f1d0de5e0aa19faf8b23ba002b3550f55942c211446b037c12b5cff8d58fb1aa"} Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.312307 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.468567 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65c9b8d4f7-7gcks"] Nov 24 09:17:42 crc kubenswrapper[4563]: W1124 09:17:42.473269 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf3bd664_81f7_40fa_a654_deaefa63e7e0.slice/crio-860025f603a81ab7984c14231fe834ef16b35885b8df8acc1eee19169a109400 WatchSource:0}: Error finding container 860025f603a81ab7984c14231fe834ef16b35885b8df8acc1eee19169a109400: Status 404 returned error can't find the container with id 860025f603a81ab7984c14231fe834ef16b35885b8df8acc1eee19169a109400 Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.611087 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c6d9948dc-d7bd7" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.653486 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzjdx\" (UniqueName: \"kubernetes.io/projected/2a278b53-bccd-4b13-aeaf-14674dacdb41-kube-api-access-xzjdx\") pod \"2a278b53-bccd-4b13-aeaf-14674dacdb41\" (UID: \"2a278b53-bccd-4b13-aeaf-14674dacdb41\") " Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.653738 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a278b53-bccd-4b13-aeaf-14674dacdb41-dns-svc\") pod \"2a278b53-bccd-4b13-aeaf-14674dacdb41\" (UID: \"2a278b53-bccd-4b13-aeaf-14674dacdb41\") " Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.653801 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a278b53-bccd-4b13-aeaf-14674dacdb41-config\") pod \"2a278b53-bccd-4b13-aeaf-14674dacdb41\" (UID: \"2a278b53-bccd-4b13-aeaf-14674dacdb41\") " Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.657842 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a278b53-bccd-4b13-aeaf-14674dacdb41-kube-api-access-xzjdx" (OuterVolumeSpecName: "kube-api-access-xzjdx") pod "2a278b53-bccd-4b13-aeaf-14674dacdb41" (UID: "2a278b53-bccd-4b13-aeaf-14674dacdb41"). InnerVolumeSpecName "kube-api-access-xzjdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.677289 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a278b53-bccd-4b13-aeaf-14674dacdb41-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2a278b53-bccd-4b13-aeaf-14674dacdb41" (UID: "2a278b53-bccd-4b13-aeaf-14674dacdb41"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.681347 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a278b53-bccd-4b13-aeaf-14674dacdb41-config" (OuterVolumeSpecName: "config") pod "2a278b53-bccd-4b13-aeaf-14674dacdb41" (UID: "2a278b53-bccd-4b13-aeaf-14674dacdb41"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.755719 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzjdx\" (UniqueName: \"kubernetes.io/projected/2a278b53-bccd-4b13-aeaf-14674dacdb41-kube-api-access-xzjdx\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.755752 4563 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a278b53-bccd-4b13-aeaf-14674dacdb41-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.755761 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a278b53-bccd-4b13-aeaf-14674dacdb41-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.803339 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c476d78c5-lxpcq"] Nov 24 09:17:42 crc kubenswrapper[4563]: W1124 09:17:42.815401 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc78e2b4d_f2bf_435e_b163_c9415021f43c.slice/crio-e0e2f7c1d12eb1b972d238ba10f4de5e91d0c63341066b3c8eb208d3422d8e1e WatchSource:0}: Error finding container e0e2f7c1d12eb1b972d238ba10f4de5e91d0c63341066b3c8eb208d3422d8e1e: Status 404 returned error can't find the container with id e0e2f7c1d12eb1b972d238ba10f4de5e91d0c63341066b3c8eb208d3422d8e1e Nov 24 09:17:42 crc 
kubenswrapper[4563]: I1124 09:17:42.817244 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.859316 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6486446b9f-zhm5r"
Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.959562 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbl4n\" (UniqueName: \"kubernetes.io/projected/cf409104-ddfd-4643-a35c-3c34c6ce2d14-kube-api-access-cbl4n\") pod \"cf409104-ddfd-4643-a35c-3c34c6ce2d14\" (UID: \"cf409104-ddfd-4643-a35c-3c34c6ce2d14\") "
Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.959791 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf409104-ddfd-4643-a35c-3c34c6ce2d14-config\") pod \"cf409104-ddfd-4643-a35c-3c34c6ce2d14\" (UID: \"cf409104-ddfd-4643-a35c-3c34c6ce2d14\") "
Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.960057 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf409104-ddfd-4643-a35c-3c34c6ce2d14-dns-svc\") pod \"cf409104-ddfd-4643-a35c-3c34c6ce2d14\" (UID: \"cf409104-ddfd-4643-a35c-3c34c6ce2d14\") "
Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.964094 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf409104-ddfd-4643-a35c-3c34c6ce2d14-kube-api-access-cbl4n" (OuterVolumeSpecName: "kube-api-access-cbl4n") pod "cf409104-ddfd-4643-a35c-3c34c6ce2d14" (UID: "cf409104-ddfd-4643-a35c-3c34c6ce2d14"). InnerVolumeSpecName "kube-api-access-cbl4n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.974183 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf409104-ddfd-4643-a35c-3c34c6ce2d14-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cf409104-ddfd-4643-a35c-3c34c6ce2d14" (UID: "cf409104-ddfd-4643-a35c-3c34c6ce2d14"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 09:17:42 crc kubenswrapper[4563]: I1124 09:17:42.976432 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf409104-ddfd-4643-a35c-3c34c6ce2d14-config" (OuterVolumeSpecName: "config") pod "cf409104-ddfd-4643-a35c-3c34c6ce2d14" (UID: "cf409104-ddfd-4643-a35c-3c34c6ce2d14"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 09:17:43 crc kubenswrapper[4563]: I1124 09:17:43.061962 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbl4n\" (UniqueName: \"kubernetes.io/projected/cf409104-ddfd-4643-a35c-3c34c6ce2d14-kube-api-access-cbl4n\") on node \"crc\" DevicePath \"\""
Nov 24 09:17:43 crc kubenswrapper[4563]: I1124 09:17:43.061991 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf409104-ddfd-4643-a35c-3c34c6ce2d14-config\") on node \"crc\" DevicePath \"\""
Nov 24 09:17:43 crc kubenswrapper[4563]: I1124 09:17:43.062001 4563 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf409104-ddfd-4643-a35c-3c34c6ce2d14-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 24 09:17:43 crc kubenswrapper[4563]: I1124 09:17:43.308473 4563 generic.go:334] "Generic (PLEG): container finished" podID="af3bd664-81f7-40fa-a654-deaefa63e7e0" containerID="7a97bf4e7620f71590aada0c3984fcc01da8826c9eec7b69f8bbc90f0b40308e" exitCode=0
Nov 24 09:17:43 crc kubenswrapper[4563]: I1124 09:17:43.308544 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c9b8d4f7-7gcks" event={"ID":"af3bd664-81f7-40fa-a654-deaefa63e7e0","Type":"ContainerDied","Data":"7a97bf4e7620f71590aada0c3984fcc01da8826c9eec7b69f8bbc90f0b40308e"}
Nov 24 09:17:43 crc kubenswrapper[4563]: I1124 09:17:43.308579 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c9b8d4f7-7gcks" event={"ID":"af3bd664-81f7-40fa-a654-deaefa63e7e0","Type":"ContainerStarted","Data":"860025f603a81ab7984c14231fe834ef16b35885b8df8acc1eee19169a109400"}
Nov 24 09:17:43 crc kubenswrapper[4563]: I1124 09:17:43.311537 4563 generic.go:334] "Generic (PLEG): container finished" podID="4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a" containerID="3ee6e9ca5ed68a65824df938c7e9cbea93b13d02d7f70cffa57d8b186c29113f" exitCode=0
Nov 24 09:17:43 crc kubenswrapper[4563]: I1124 09:17:43.312508 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c476d78c5-lxpcq" event={"ID":"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a","Type":"ContainerDied","Data":"3ee6e9ca5ed68a65824df938c7e9cbea93b13d02d7f70cffa57d8b186c29113f"}
Nov 24 09:17:43 crc kubenswrapper[4563]: I1124 09:17:43.312764 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c476d78c5-lxpcq" event={"ID":"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a","Type":"ContainerStarted","Data":"f8f29aed2a5fefbe61532e4298057e598a37f5f8bc2b80633b7dbd62d1ebb336"}
Nov 24 09:17:43 crc kubenswrapper[4563]: I1124 09:17:43.315205 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c6d9948dc-d7bd7" event={"ID":"2a278b53-bccd-4b13-aeaf-14674dacdb41","Type":"ContainerDied","Data":"1e49619eb37ef4b2e140fad8fe4e5c1ba67378fc744dfd898362836910d05fdf"}
Nov 24 09:17:43 crc kubenswrapper[4563]: I1124 09:17:43.315318 4563 scope.go:117] "RemoveContainer" containerID="bf84338a17b058ad9bd8e0073c93e55b467f9f88ac8790bc4032679be98fcab8"
Nov 24 09:17:43 crc kubenswrapper[4563]: I1124 09:17:43.315782 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c6d9948dc-d7bd7"
Nov 24 09:17:43 crc kubenswrapper[4563]: E1124 09:17:43.319932 4563 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.26.216:55320->192.168.26.216:44383: write tcp 192.168.26.216:55320->192.168.26.216:44383: write: broken pipe
Nov 24 09:17:43 crc kubenswrapper[4563]: I1124 09:17:43.323141 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6486446b9f-zhm5r"
Nov 24 09:17:43 crc kubenswrapper[4563]: I1124 09:17:43.323195 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6486446b9f-zhm5r" event={"ID":"cf409104-ddfd-4643-a35c-3c34c6ce2d14","Type":"ContainerDied","Data":"41adc451abd9e333d33c67feed1d720d586b413e1bc29163e3bb697cb0c7e9d3"}
Nov 24 09:17:43 crc kubenswrapper[4563]: I1124 09:17:43.330836 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-cblxg" event={"ID":"d5d57856-0858-4ef6-86b1-282d4bc462be","Type":"ContainerStarted","Data":"b45c8b27c11b809e28e9bc3b45019e0eec27b3b6a9643563c23051af792bca7a"}
Nov 24 09:17:43 crc kubenswrapper[4563]: I1124 09:17:43.337130 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c78e2b4d-f2bf-435e-b163-c9415021f43c","Type":"ContainerStarted","Data":"e0e2f7c1d12eb1b972d238ba10f4de5e91d0c63341066b3c8eb208d3422d8e1e"}
Nov 24 09:17:43 crc kubenswrapper[4563]: I1124 09:17:43.354676 4563 scope.go:117] "RemoveContainer" containerID="90c6c8e7be02164e2bb4a19c3d54d125ec56e1f2366a2c15d0e7cbda482df4f2"
Nov 24 09:17:43 crc kubenswrapper[4563]: I1124 09:17:43.420590 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c6d9948dc-d7bd7"]
Nov 24 09:17:43 crc kubenswrapper[4563]: I1124 09:17:43.424674 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c6d9948dc-d7bd7"]
Nov 24 09:17:43 crc kubenswrapper[4563]: I1124 09:17:43.438560 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6486446b9f-zhm5r"]
Nov 24 09:17:43 crc kubenswrapper[4563]: I1124 09:17:43.440398 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6486446b9f-zhm5r"]
Nov 24 09:17:43 crc kubenswrapper[4563]: I1124 09:17:43.441463 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-cblxg" podStartSLOduration=2.441445292 podStartE2EDuration="2.441445292s" podCreationTimestamp="2025-11-24 09:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:17:43.413769608 +0000 UTC m=+840.672747055" watchObservedRunningTime="2025-11-24 09:17:43.441445292 +0000 UTC m=+840.700422740"
Nov 24 09:17:44 crc kubenswrapper[4563]: I1124 09:17:44.346382 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c78e2b4d-f2bf-435e-b163-c9415021f43c","Type":"ContainerStarted","Data":"392d913d51da7087b02d58b54ab0130a8b89cdddaafe860cab4c80b15ca993f3"}
Nov 24 09:17:44 crc kubenswrapper[4563]: I1124 09:17:44.346756 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c78e2b4d-f2bf-435e-b163-c9415021f43c","Type":"ContainerStarted","Data":"86d4c42984e8f3f8a05f945c7e12bd7bc70ad000f322e1abb41f8c9e362d6fa8"}
Nov 24 09:17:44 crc kubenswrapper[4563]: I1124 09:17:44.346832 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Nov 24 09:17:44 crc kubenswrapper[4563]: I1124 09:17:44.349039 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c9b8d4f7-7gcks" event={"ID":"af3bd664-81f7-40fa-a654-deaefa63e7e0","Type":"ContainerStarted","Data":"b6ab5a048f3cb70c5fe7d3e8b44e2006f60e00792e629c8fea25386835a6c91a"}
Nov 24 09:17:44 crc kubenswrapper[4563]: I1124 09:17:44.349162 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-65c9b8d4f7-7gcks"
Nov 24 09:17:44 crc kubenswrapper[4563]: I1124 09:17:44.351919 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c476d78c5-lxpcq" event={"ID":"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a","Type":"ContainerStarted","Data":"b263f4323742b4c61ada063990c55b44342de95d15e73e2fed19f592f99c2f3d"}
Nov 24 09:17:44 crc kubenswrapper[4563]: I1124 09:17:44.352009 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c476d78c5-lxpcq"
Nov 24 09:17:44 crc kubenswrapper[4563]: I1124 09:17:44.366513 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.247210254 podStartE2EDuration="3.366496171s" podCreationTimestamp="2025-11-24 09:17:41 +0000 UTC" firstStartedPulling="2025-11-24 09:17:42.817038357 +0000 UTC m=+840.076015794" lastFinishedPulling="2025-11-24 09:17:43.936324264 +0000 UTC m=+841.195301711" observedRunningTime="2025-11-24 09:17:44.362673721 +0000 UTC m=+841.621651169" watchObservedRunningTime="2025-11-24 09:17:44.366496171 +0000 UTC m=+841.625473618"
Nov 24 09:17:44 crc kubenswrapper[4563]: I1124 09:17:44.386307 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c476d78c5-lxpcq" podStartSLOduration=3.386289928 podStartE2EDuration="3.386289928s" podCreationTimestamp="2025-11-24 09:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:17:44.382445316 +0000 UTC m=+841.641422763" watchObservedRunningTime="2025-11-24 09:17:44.386289928 +0000 UTC m=+841.645267374"
Nov 24 09:17:44 crc kubenswrapper[4563]: I1124 09:17:44.402180 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-65c9b8d4f7-7gcks" podStartSLOduration=3.402163189 podStartE2EDuration="3.402163189s" podCreationTimestamp="2025-11-24 09:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:17:44.396364771 +0000 UTC m=+841.655342228" watchObservedRunningTime="2025-11-24 09:17:44.402163189 +0000 UTC m=+841.661140635"
Nov 24 09:17:44 crc kubenswrapper[4563]: I1124 09:17:44.560442 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Nov 24 09:17:44 crc kubenswrapper[4563]: I1124 09:17:44.560545 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Nov 24 09:17:44 crc kubenswrapper[4563]: I1124 09:17:44.623312 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Nov 24 09:17:45 crc kubenswrapper[4563]: I1124 09:17:45.062804 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a278b53-bccd-4b13-aeaf-14674dacdb41" path="/var/lib/kubelet/pods/2a278b53-bccd-4b13-aeaf-14674dacdb41/volumes"
Nov 24 09:17:45 crc kubenswrapper[4563]: I1124 09:17:45.063386 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf409104-ddfd-4643-a35c-3c34c6ce2d14" path="/var/lib/kubelet/pods/cf409104-ddfd-4643-a35c-3c34c6ce2d14/volumes"
Nov 24 09:17:45 crc kubenswrapper[4563]: I1124 09:17:45.428716 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Nov 24 09:17:45 crc kubenswrapper[4563]: I1124 09:17:45.895150 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-aa19-account-create-bp4rh"]
Nov 24 09:17:45 crc kubenswrapper[4563]: E1124 09:17:45.895520 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a278b53-bccd-4b13-aeaf-14674dacdb41" containerName="init"
Nov 24 09:17:45 crc kubenswrapper[4563]: I1124 09:17:45.895541 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a278b53-bccd-4b13-aeaf-14674dacdb41" containerName="init"
Nov 24 09:17:45 crc kubenswrapper[4563]: E1124 09:17:45.895563 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf409104-ddfd-4643-a35c-3c34c6ce2d14" containerName="init"
Nov 24 09:17:45 crc kubenswrapper[4563]: I1124 09:17:45.895569 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf409104-ddfd-4643-a35c-3c34c6ce2d14" containerName="init"
Nov 24 09:17:45 crc kubenswrapper[4563]: I1124 09:17:45.895755 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a278b53-bccd-4b13-aeaf-14674dacdb41" containerName="init"
Nov 24 09:17:45 crc kubenswrapper[4563]: I1124 09:17:45.895776 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf409104-ddfd-4643-a35c-3c34c6ce2d14" containerName="init"
Nov 24 09:17:45 crc kubenswrapper[4563]: I1124 09:17:45.896268 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-aa19-account-create-bp4rh"
Nov 24 09:17:45 crc kubenswrapper[4563]: I1124 09:17:45.899538 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Nov 24 09:17:45 crc kubenswrapper[4563]: I1124 09:17:45.910799 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-aa19-account-create-bp4rh"]
Nov 24 09:17:45 crc kubenswrapper[4563]: I1124 09:17:45.937784 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-2snzz"]
Nov 24 09:17:45 crc kubenswrapper[4563]: I1124 09:17:45.939304 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2snzz"
Nov 24 09:17:45 crc kubenswrapper[4563]: I1124 09:17:45.948040 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-2snzz"]
Nov 24 09:17:45 crc kubenswrapper[4563]: I1124 09:17:45.954374 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Nov 24 09:17:45 crc kubenswrapper[4563]: I1124 09:17:45.954471 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.016532 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.029188 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84jgv\" (UniqueName: \"kubernetes.io/projected/b6005814-b899-4e79-816e-c51ffbe41a91-kube-api-access-84jgv\") pod \"keystone-aa19-account-create-bp4rh\" (UID: \"b6005814-b899-4e79-816e-c51ffbe41a91\") " pod="openstack/keystone-aa19-account-create-bp4rh"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.030080 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6005814-b899-4e79-816e-c51ffbe41a91-operator-scripts\") pod \"keystone-aa19-account-create-bp4rh\" (UID: \"b6005814-b899-4e79-816e-c51ffbe41a91\") " pod="openstack/keystone-aa19-account-create-bp4rh"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.113608 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-2c9nb"]
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.115151 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2c9nb"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.118955 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-2c9nb"]
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.131560 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8c3184f-c96d-4339-8d4c-af9b3fa9a04d-operator-scripts\") pod \"keystone-db-create-2snzz\" (UID: \"e8c3184f-c96d-4339-8d4c-af9b3fa9a04d\") " pod="openstack/keystone-db-create-2snzz"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.131704 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6005814-b899-4e79-816e-c51ffbe41a91-operator-scripts\") pod \"keystone-aa19-account-create-bp4rh\" (UID: \"b6005814-b899-4e79-816e-c51ffbe41a91\") " pod="openstack/keystone-aa19-account-create-bp4rh"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.131857 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96dkf\" (UniqueName: \"kubernetes.io/projected/e8c3184f-c96d-4339-8d4c-af9b3fa9a04d-kube-api-access-96dkf\") pod \"keystone-db-create-2snzz\" (UID: \"e8c3184f-c96d-4339-8d4c-af9b3fa9a04d\") " pod="openstack/keystone-db-create-2snzz"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.131881 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84jgv\" (UniqueName: \"kubernetes.io/projected/b6005814-b899-4e79-816e-c51ffbe41a91-kube-api-access-84jgv\") pod \"keystone-aa19-account-create-bp4rh\" (UID: \"b6005814-b899-4e79-816e-c51ffbe41a91\") " pod="openstack/keystone-aa19-account-create-bp4rh"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.134757 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6005814-b899-4e79-816e-c51ffbe41a91-operator-scripts\") pod \"keystone-aa19-account-create-bp4rh\" (UID: \"b6005814-b899-4e79-816e-c51ffbe41a91\") " pod="openstack/keystone-aa19-account-create-bp4rh"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.152382 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84jgv\" (UniqueName: \"kubernetes.io/projected/b6005814-b899-4e79-816e-c51ffbe41a91-kube-api-access-84jgv\") pod \"keystone-aa19-account-create-bp4rh\" (UID: \"b6005814-b899-4e79-816e-c51ffbe41a91\") " pod="openstack/keystone-aa19-account-create-bp4rh"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.213867 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-2820-account-create-rzsg6"]
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.215228 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2820-account-create-rzsg6"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.217086 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.220582 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2820-account-create-rzsg6"]
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.233782 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8c3184f-c96d-4339-8d4c-af9b3fa9a04d-operator-scripts\") pod \"keystone-db-create-2snzz\" (UID: \"e8c3184f-c96d-4339-8d4c-af9b3fa9a04d\") " pod="openstack/keystone-db-create-2snzz"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.233897 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flttz\" (UniqueName: \"kubernetes.io/projected/579d7285-3560-4795-8a38-516fd67df1f4-kube-api-access-flttz\") pod \"placement-db-create-2c9nb\" (UID: \"579d7285-3560-4795-8a38-516fd67df1f4\") " pod="openstack/placement-db-create-2c9nb"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.233934 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/579d7285-3560-4795-8a38-516fd67df1f4-operator-scripts\") pod \"placement-db-create-2c9nb\" (UID: \"579d7285-3560-4795-8a38-516fd67df1f4\") " pod="openstack/placement-db-create-2c9nb"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.233996 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96dkf\" (UniqueName: \"kubernetes.io/projected/e8c3184f-c96d-4339-8d4c-af9b3fa9a04d-kube-api-access-96dkf\") pod \"keystone-db-create-2snzz\" (UID: \"e8c3184f-c96d-4339-8d4c-af9b3fa9a04d\") " pod="openstack/keystone-db-create-2snzz"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.234377 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8c3184f-c96d-4339-8d4c-af9b3fa9a04d-operator-scripts\") pod \"keystone-db-create-2snzz\" (UID: \"e8c3184f-c96d-4339-8d4c-af9b3fa9a04d\") " pod="openstack/keystone-db-create-2snzz"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.241837 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-aa19-account-create-bp4rh"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.252933 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96dkf\" (UniqueName: \"kubernetes.io/projected/e8c3184f-c96d-4339-8d4c-af9b3fa9a04d-kube-api-access-96dkf\") pod \"keystone-db-create-2snzz\" (UID: \"e8c3184f-c96d-4339-8d4c-af9b3fa9a04d\") " pod="openstack/keystone-db-create-2snzz"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.254777 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2snzz"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.336602 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flttz\" (UniqueName: \"kubernetes.io/projected/579d7285-3560-4795-8a38-516fd67df1f4-kube-api-access-flttz\") pod \"placement-db-create-2c9nb\" (UID: \"579d7285-3560-4795-8a38-516fd67df1f4\") " pod="openstack/placement-db-create-2c9nb"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.338033 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/579d7285-3560-4795-8a38-516fd67df1f4-operator-scripts\") pod \"placement-db-create-2c9nb\" (UID: \"579d7285-3560-4795-8a38-516fd67df1f4\") " pod="openstack/placement-db-create-2c9nb"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.338208 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6072ef52-6143-434b-b56f-06e3d11c966f-operator-scripts\") pod \"placement-2820-account-create-rzsg6\" (UID: \"6072ef52-6143-434b-b56f-06e3d11c966f\") " pod="openstack/placement-2820-account-create-rzsg6"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.338835 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58knm\" (UniqueName: \"kubernetes.io/projected/6072ef52-6143-434b-b56f-06e3d11c966f-kube-api-access-58knm\") pod \"placement-2820-account-create-rzsg6\" (UID: \"6072ef52-6143-434b-b56f-06e3d11c966f\") " pod="openstack/placement-2820-account-create-rzsg6"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.339379 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/579d7285-3560-4795-8a38-516fd67df1f4-operator-scripts\") pod \"placement-db-create-2c9nb\" (UID: \"579d7285-3560-4795-8a38-516fd67df1f4\") " pod="openstack/placement-db-create-2c9nb"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.356285 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flttz\" (UniqueName: \"kubernetes.io/projected/579d7285-3560-4795-8a38-516fd67df1f4-kube-api-access-flttz\") pod \"placement-db-create-2c9nb\" (UID: \"579d7285-3560-4795-8a38-516fd67df1f4\") " pod="openstack/placement-db-create-2c9nb"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.432871 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2c9nb"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.445378 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58knm\" (UniqueName: \"kubernetes.io/projected/6072ef52-6143-434b-b56f-06e3d11c966f-kube-api-access-58knm\") pod \"placement-2820-account-create-rzsg6\" (UID: \"6072ef52-6143-434b-b56f-06e3d11c966f\") " pod="openstack/placement-2820-account-create-rzsg6"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.445606 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6072ef52-6143-434b-b56f-06e3d11c966f-operator-scripts\") pod \"placement-2820-account-create-rzsg6\" (UID: \"6072ef52-6143-434b-b56f-06e3d11c966f\") " pod="openstack/placement-2820-account-create-rzsg6"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.446231 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6072ef52-6143-434b-b56f-06e3d11c966f-operator-scripts\") pod \"placement-2820-account-create-rzsg6\" (UID: \"6072ef52-6143-434b-b56f-06e3d11c966f\") " pod="openstack/placement-2820-account-create-rzsg6"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.459188 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.459835 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58knm\" (UniqueName: \"kubernetes.io/projected/6072ef52-6143-434b-b56f-06e3d11c966f-kube-api-access-58knm\") pod \"placement-2820-account-create-rzsg6\" (UID: \"6072ef52-6143-434b-b56f-06e3d11c966f\") " pod="openstack/placement-2820-account-create-rzsg6"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.535960 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2820-account-create-rzsg6"
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.642461 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-aa19-account-create-bp4rh"]
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.726809 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-2snzz"]
Nov 24 09:17:46 crc kubenswrapper[4563]: W1124 09:17:46.736817 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8c3184f_c96d_4339_8d4c_af9b3fa9a04d.slice/crio-f4586fb510c6d9fa4f31256eaed290f599a8eb4d7090e8a4c6a20d08a5121e81 WatchSource:0}: Error finding container f4586fb510c6d9fa4f31256eaed290f599a8eb4d7090e8a4c6a20d08a5121e81: Status 404 returned error can't find the container with id f4586fb510c6d9fa4f31256eaed290f599a8eb4d7090e8a4c6a20d08a5121e81
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.853481 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-2c9nb"]
Nov 24 09:17:46 crc kubenswrapper[4563]: W1124 09:17:46.856402 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod579d7285_3560_4795_8a38_516fd67df1f4.slice/crio-d034034165a7386ce92703a928c08396373b518a22f5900b9db2b49f6d2ce96e WatchSource:0}: Error finding container d034034165a7386ce92703a928c08396373b518a22f5900b9db2b49f6d2ce96e: Status 404 returned error can't find the container with id d034034165a7386ce92703a928c08396373b518a22f5900b9db2b49f6d2ce96e
Nov 24 09:17:46 crc kubenswrapper[4563]: I1124 09:17:46.925904 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2820-account-create-rzsg6"]
Nov 24 09:17:47 crc kubenswrapper[4563]: W1124 09:17:47.065875 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6072ef52_6143_434b_b56f_06e3d11c966f.slice/crio-e8b80f57bce9e05d11678b69f3350bead58ce88a364921af87df01d9672e3d70 WatchSource:0}: Error finding container e8b80f57bce9e05d11678b69f3350bead58ce88a364921af87df01d9672e3d70: Status 404 returned error can't find the container with id e8b80f57bce9e05d11678b69f3350bead58ce88a364921af87df01d9672e3d70
Nov 24 09:17:47 crc kubenswrapper[4563]: I1124 09:17:47.389725 4563 generic.go:334] "Generic (PLEG): container finished" podID="6072ef52-6143-434b-b56f-06e3d11c966f" containerID="d7c7a153898120b18ac46a4295647a7c3f657ef8b1786ddea6ed4b60e30dc468" exitCode=0
Nov 24 09:17:47 crc kubenswrapper[4563]: I1124 09:17:47.389808 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2820-account-create-rzsg6" event={"ID":"6072ef52-6143-434b-b56f-06e3d11c966f","Type":"ContainerDied","Data":"d7c7a153898120b18ac46a4295647a7c3f657ef8b1786ddea6ed4b60e30dc468"}
Nov 24 09:17:47 crc kubenswrapper[4563]: I1124 09:17:47.389847 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2820-account-create-rzsg6" event={"ID":"6072ef52-6143-434b-b56f-06e3d11c966f","Type":"ContainerStarted","Data":"e8b80f57bce9e05d11678b69f3350bead58ce88a364921af87df01d9672e3d70"}
Nov 24 09:17:47 crc kubenswrapper[4563]: I1124 09:17:47.391866 4563 generic.go:334] "Generic (PLEG): container finished" podID="b6005814-b899-4e79-816e-c51ffbe41a91" containerID="6d7d5cb51b9fda5a5f8a5b4baaedf01d35e6707e1dcf24a352b6d310937270ea" exitCode=0
Nov 24 09:17:47 crc kubenswrapper[4563]: I1124 09:17:47.391940 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-aa19-account-create-bp4rh" event={"ID":"b6005814-b899-4e79-816e-c51ffbe41a91","Type":"ContainerDied","Data":"6d7d5cb51b9fda5a5f8a5b4baaedf01d35e6707e1dcf24a352b6d310937270ea"}
Nov 24 09:17:47 crc kubenswrapper[4563]: I1124 09:17:47.391970 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-aa19-account-create-bp4rh" event={"ID":"b6005814-b899-4e79-816e-c51ffbe41a91","Type":"ContainerStarted","Data":"4d1fb504fa520828ca839e2a6b0e6df26c05591ed65a1e2c06b764c72b40c468"}
Nov 24 09:17:47 crc kubenswrapper[4563]: I1124 09:17:47.394473 4563 generic.go:334] "Generic (PLEG): container finished" podID="579d7285-3560-4795-8a38-516fd67df1f4" containerID="9b592cd0dae33a2878daf2d9d6185ad96e65f221e44d94981a0809304730d61e" exitCode=0
Nov 24 09:17:47 crc kubenswrapper[4563]: I1124 09:17:47.394531 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2c9nb" event={"ID":"579d7285-3560-4795-8a38-516fd67df1f4","Type":"ContainerDied","Data":"9b592cd0dae33a2878daf2d9d6185ad96e65f221e44d94981a0809304730d61e"}
Nov 24 09:17:47 crc kubenswrapper[4563]: I1124 09:17:47.394581 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2c9nb" event={"ID":"579d7285-3560-4795-8a38-516fd67df1f4","Type":"ContainerStarted","Data":"d034034165a7386ce92703a928c08396373b518a22f5900b9db2b49f6d2ce96e"}
Nov 24 09:17:47 crc kubenswrapper[4563]: I1124 09:17:47.396285 4563 generic.go:334] "Generic (PLEG): container finished" podID="e8c3184f-c96d-4339-8d4c-af9b3fa9a04d" containerID="2573716fbde1d07a7b809aa931d164137b737d6fe6779f91fe62bbdbb764872a" exitCode=0
Nov 24 09:17:47 crc kubenswrapper[4563]: I1124 09:17:47.396347 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2snzz" event={"ID":"e8c3184f-c96d-4339-8d4c-af9b3fa9a04d","Type":"ContainerDied","Data":"2573716fbde1d07a7b809aa931d164137b737d6fe6779f91fe62bbdbb764872a"}
Nov 24 09:17:47 crc kubenswrapper[4563]: I1124 09:17:47.396375 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2snzz" event={"ID":"e8c3184f-c96d-4339-8d4c-af9b3fa9a04d","Type":"ContainerStarted","Data":"f4586fb510c6d9fa4f31256eaed290f599a8eb4d7090e8a4c6a20d08a5121e81"}
Nov 24 09:17:47 crc kubenswrapper[4563]: I1124 09:17:47.937975 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c476d78c5-lxpcq"]
Nov 24 09:17:47 crc kubenswrapper[4563]: I1124 09:17:47.938235 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c476d78c5-lxpcq" podUID="4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a" containerName="dnsmasq-dns" containerID="cri-o://b263f4323742b4c61ada063990c55b44342de95d15e73e2fed19f592f99c2f3d" gracePeriod=10
Nov 24 09:17:47 crc kubenswrapper[4563]: I1124 09:17:47.964599 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9fdb784c-47fgk"]
Nov 24 09:17:47 crc kubenswrapper[4563]: I1124 09:17:47.965844 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk"
Nov 24 09:17:47 crc kubenswrapper[4563]: I1124 09:17:47.981955 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9fdb784c-47fgk"]
Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.053379 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.079080 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46649dc4-4337-4378-a0a1-70b329141a22-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9fdb784c-47fgk\" (UID: \"46649dc4-4337-4378-a0a1-70b329141a22\") " pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk"
Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.079388 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46649dc4-4337-4378-a0a1-70b329141a22-dns-svc\") pod \"dnsmasq-dns-5c9fdb784c-47fgk\" (UID: \"46649dc4-4337-4378-a0a1-70b329141a22\") " pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk"
Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.079421 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46649dc4-4337-4378-a0a1-70b329141a22-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9fdb784c-47fgk\" (UID: \"46649dc4-4337-4378-a0a1-70b329141a22\") " pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk"
Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.079441 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46649dc4-4337-4378-a0a1-70b329141a22-config\") pod \"dnsmasq-dns-5c9fdb784c-47fgk\" (UID: \"46649dc4-4337-4378-a0a1-70b329141a22\") " pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk"
Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.079475 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7kvz\" (UniqueName: \"kubernetes.io/projected/46649dc4-4337-4378-a0a1-70b329141a22-kube-api-access-l7kvz\") pod \"dnsmasq-dns-5c9fdb784c-47fgk\" (UID: \"46649dc4-4337-4378-a0a1-70b329141a22\") " pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk"
Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.181964 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46649dc4-4337-4378-a0a1-70b329141a22-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9fdb784c-47fgk\" (UID: \"46649dc4-4337-4378-a0a1-70b329141a22\") " pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk"
Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.182137 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46649dc4-4337-4378-a0a1-70b329141a22-dns-svc\") pod \"dnsmasq-dns-5c9fdb784c-47fgk\" (UID: \"46649dc4-4337-4378-a0a1-70b329141a22\") " pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk"
Nov 24 09:17:48 crc
kubenswrapper[4563]: I1124 09:17:48.182203 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46649dc4-4337-4378-a0a1-70b329141a22-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9fdb784c-47fgk\" (UID: \"46649dc4-4337-4378-a0a1-70b329141a22\") " pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.182231 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46649dc4-4337-4378-a0a1-70b329141a22-config\") pod \"dnsmasq-dns-5c9fdb784c-47fgk\" (UID: \"46649dc4-4337-4378-a0a1-70b329141a22\") " pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.182324 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7kvz\" (UniqueName: \"kubernetes.io/projected/46649dc4-4337-4378-a0a1-70b329141a22-kube-api-access-l7kvz\") pod \"dnsmasq-dns-5c9fdb784c-47fgk\" (UID: \"46649dc4-4337-4378-a0a1-70b329141a22\") " pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.183204 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46649dc4-4337-4378-a0a1-70b329141a22-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9fdb784c-47fgk\" (UID: \"46649dc4-4337-4378-a0a1-70b329141a22\") " pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.184152 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46649dc4-4337-4378-a0a1-70b329141a22-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9fdb784c-47fgk\" (UID: \"46649dc4-4337-4378-a0a1-70b329141a22\") " pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.184704 4563 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46649dc4-4337-4378-a0a1-70b329141a22-config\") pod \"dnsmasq-dns-5c9fdb784c-47fgk\" (UID: \"46649dc4-4337-4378-a0a1-70b329141a22\") " pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.184824 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46649dc4-4337-4378-a0a1-70b329141a22-dns-svc\") pod \"dnsmasq-dns-5c9fdb784c-47fgk\" (UID: \"46649dc4-4337-4378-a0a1-70b329141a22\") " pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.199120 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7kvz\" (UniqueName: \"kubernetes.io/projected/46649dc4-4337-4378-a0a1-70b329141a22-kube-api-access-l7kvz\") pod \"dnsmasq-dns-5c9fdb784c-47fgk\" (UID: \"46649dc4-4337-4378-a0a1-70b329141a22\") " pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.289142 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.388386 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c476d78c5-lxpcq" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.430349 4563 generic.go:334] "Generic (PLEG): container finished" podID="4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a" containerID="b263f4323742b4c61ada063990c55b44342de95d15e73e2fed19f592f99c2f3d" exitCode=0 Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.430592 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c476d78c5-lxpcq" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.431205 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c476d78c5-lxpcq" event={"ID":"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a","Type":"ContainerDied","Data":"b263f4323742b4c61ada063990c55b44342de95d15e73e2fed19f592f99c2f3d"} Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.431250 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c476d78c5-lxpcq" event={"ID":"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a","Type":"ContainerDied","Data":"f8f29aed2a5fefbe61532e4298057e598a37f5f8bc2b80633b7dbd62d1ebb336"} Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.431281 4563 scope.go:117] "RemoveContainer" containerID="b263f4323742b4c61ada063990c55b44342de95d15e73e2fed19f592f99c2f3d" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.452254 4563 scope.go:117] "RemoveContainer" containerID="3ee6e9ca5ed68a65824df938c7e9cbea93b13d02d7f70cffa57d8b186c29113f" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.489032 4563 scope.go:117] "RemoveContainer" containerID="b263f4323742b4c61ada063990c55b44342de95d15e73e2fed19f592f99c2f3d" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.490161 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-dns-svc\") pod \"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a\" (UID: \"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a\") " Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.490316 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-config\") pod \"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a\" (UID: \"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a\") " Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.490508 4563 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-ovsdbserver-nb\") pod \"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a\" (UID: \"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a\") " Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.490734 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x578\" (UniqueName: \"kubernetes.io/projected/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-kube-api-access-9x578\") pod \"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a\" (UID: \"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a\") " Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.490845 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-ovsdbserver-sb\") pod \"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a\" (UID: \"4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a\") " Nov 24 09:17:48 crc kubenswrapper[4563]: E1124 09:17:48.495908 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b263f4323742b4c61ada063990c55b44342de95d15e73e2fed19f592f99c2f3d\": container with ID starting with b263f4323742b4c61ada063990c55b44342de95d15e73e2fed19f592f99c2f3d not found: ID does not exist" containerID="b263f4323742b4c61ada063990c55b44342de95d15e73e2fed19f592f99c2f3d" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.495939 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-kube-api-access-9x578" (OuterVolumeSpecName: "kube-api-access-9x578") pod "4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a" (UID: "4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a"). InnerVolumeSpecName "kube-api-access-9x578". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.495965 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b263f4323742b4c61ada063990c55b44342de95d15e73e2fed19f592f99c2f3d"} err="failed to get container status \"b263f4323742b4c61ada063990c55b44342de95d15e73e2fed19f592f99c2f3d\": rpc error: code = NotFound desc = could not find container \"b263f4323742b4c61ada063990c55b44342de95d15e73e2fed19f592f99c2f3d\": container with ID starting with b263f4323742b4c61ada063990c55b44342de95d15e73e2fed19f592f99c2f3d not found: ID does not exist" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.495995 4563 scope.go:117] "RemoveContainer" containerID="3ee6e9ca5ed68a65824df938c7e9cbea93b13d02d7f70cffa57d8b186c29113f" Nov 24 09:17:48 crc kubenswrapper[4563]: E1124 09:17:48.496360 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ee6e9ca5ed68a65824df938c7e9cbea93b13d02d7f70cffa57d8b186c29113f\": container with ID starting with 3ee6e9ca5ed68a65824df938c7e9cbea93b13d02d7f70cffa57d8b186c29113f not found: ID does not exist" containerID="3ee6e9ca5ed68a65824df938c7e9cbea93b13d02d7f70cffa57d8b186c29113f" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.496395 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ee6e9ca5ed68a65824df938c7e9cbea93b13d02d7f70cffa57d8b186c29113f"} err="failed to get container status \"3ee6e9ca5ed68a65824df938c7e9cbea93b13d02d7f70cffa57d8b186c29113f\": rpc error: code = NotFound desc = could not find container \"3ee6e9ca5ed68a65824df938c7e9cbea93b13d02d7f70cffa57d8b186c29113f\": container with ID starting with 3ee6e9ca5ed68a65824df938c7e9cbea93b13d02d7f70cffa57d8b186c29113f not found: ID does not exist" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.523525 4563 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a" (UID: "4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.527520 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-config" (OuterVolumeSpecName: "config") pod "4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a" (UID: "4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.534360 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a" (UID: "4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.542408 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a" (UID: "4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.593868 4563 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.593913 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.593924 4563 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.593937 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x578\" (UniqueName: \"kubernetes.io/projected/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-kube-api-access-9x578\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.593950 4563 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.699085 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9fdb784c-47fgk"] Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.738972 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-aa19-account-create-bp4rh" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.772420 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c476d78c5-lxpcq"] Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.777854 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c476d78c5-lxpcq"] Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.876691 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2820-account-create-rzsg6" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.880627 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2c9nb" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.899797 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84jgv\" (UniqueName: \"kubernetes.io/projected/b6005814-b899-4e79-816e-c51ffbe41a91-kube-api-access-84jgv\") pod \"b6005814-b899-4e79-816e-c51ffbe41a91\" (UID: \"b6005814-b899-4e79-816e-c51ffbe41a91\") " Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.900334 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6005814-b899-4e79-816e-c51ffbe41a91-operator-scripts\") pod \"b6005814-b899-4e79-816e-c51ffbe41a91\" (UID: \"b6005814-b899-4e79-816e-c51ffbe41a91\") " Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.900826 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6005814-b899-4e79-816e-c51ffbe41a91-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b6005814-b899-4e79-816e-c51ffbe41a91" (UID: "b6005814-b899-4e79-816e-c51ffbe41a91"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.903146 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6005814-b899-4e79-816e-c51ffbe41a91-kube-api-access-84jgv" (OuterVolumeSpecName: "kube-api-access-84jgv") pod "b6005814-b899-4e79-816e-c51ffbe41a91" (UID: "b6005814-b899-4e79-816e-c51ffbe41a91"). InnerVolumeSpecName "kube-api-access-84jgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:17:48 crc kubenswrapper[4563]: I1124 09:17:48.910682 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2snzz" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.001925 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6072ef52-6143-434b-b56f-06e3d11c966f-operator-scripts\") pod \"6072ef52-6143-434b-b56f-06e3d11c966f\" (UID: \"6072ef52-6143-434b-b56f-06e3d11c966f\") " Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.001997 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58knm\" (UniqueName: \"kubernetes.io/projected/6072ef52-6143-434b-b56f-06e3d11c966f-kube-api-access-58knm\") pod \"6072ef52-6143-434b-b56f-06e3d11c966f\" (UID: \"6072ef52-6143-434b-b56f-06e3d11c966f\") " Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.002325 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/579d7285-3560-4795-8a38-516fd67df1f4-operator-scripts\") pod \"579d7285-3560-4795-8a38-516fd67df1f4\" (UID: \"579d7285-3560-4795-8a38-516fd67df1f4\") " Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.002379 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/6072ef52-6143-434b-b56f-06e3d11c966f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6072ef52-6143-434b-b56f-06e3d11c966f" (UID: "6072ef52-6143-434b-b56f-06e3d11c966f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.002392 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flttz\" (UniqueName: \"kubernetes.io/projected/579d7285-3560-4795-8a38-516fd67df1f4-kube-api-access-flttz\") pod \"579d7285-3560-4795-8a38-516fd67df1f4\" (UID: \"579d7285-3560-4795-8a38-516fd67df1f4\") " Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.002892 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/579d7285-3560-4795-8a38-516fd67df1f4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "579d7285-3560-4795-8a38-516fd67df1f4" (UID: "579d7285-3560-4795-8a38-516fd67df1f4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.003495 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84jgv\" (UniqueName: \"kubernetes.io/projected/b6005814-b899-4e79-816e-c51ffbe41a91-kube-api-access-84jgv\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.003522 4563 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6072ef52-6143-434b-b56f-06e3d11c966f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.003533 4563 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6005814-b899-4e79-816e-c51ffbe41a91-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.003545 4563 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/579d7285-3560-4795-8a38-516fd67df1f4-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.005250 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6072ef52-6143-434b-b56f-06e3d11c966f-kube-api-access-58knm" (OuterVolumeSpecName: "kube-api-access-58knm") pod "6072ef52-6143-434b-b56f-06e3d11c966f" (UID: "6072ef52-6143-434b-b56f-06e3d11c966f"). InnerVolumeSpecName "kube-api-access-58knm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.005390 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/579d7285-3560-4795-8a38-516fd67df1f4-kube-api-access-flttz" (OuterVolumeSpecName: "kube-api-access-flttz") pod "579d7285-3560-4795-8a38-516fd67df1f4" (UID: "579d7285-3560-4795-8a38-516fd67df1f4"). 
InnerVolumeSpecName "kube-api-access-flttz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.063306 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a" path="/var/lib/kubelet/pods/4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a/volumes" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.104944 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96dkf\" (UniqueName: \"kubernetes.io/projected/e8c3184f-c96d-4339-8d4c-af9b3fa9a04d-kube-api-access-96dkf\") pod \"e8c3184f-c96d-4339-8d4c-af9b3fa9a04d\" (UID: \"e8c3184f-c96d-4339-8d4c-af9b3fa9a04d\") " Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.105097 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8c3184f-c96d-4339-8d4c-af9b3fa9a04d-operator-scripts\") pod \"e8c3184f-c96d-4339-8d4c-af9b3fa9a04d\" (UID: \"e8c3184f-c96d-4339-8d4c-af9b3fa9a04d\") " Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.108973 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8c3184f-c96d-4339-8d4c-af9b3fa9a04d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e8c3184f-c96d-4339-8d4c-af9b3fa9a04d" (UID: "e8c3184f-c96d-4339-8d4c-af9b3fa9a04d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.109898 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58knm\" (UniqueName: \"kubernetes.io/projected/6072ef52-6143-434b-b56f-06e3d11c966f-kube-api-access-58knm\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.109942 4563 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8c3184f-c96d-4339-8d4c-af9b3fa9a04d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.109955 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flttz\" (UniqueName: \"kubernetes.io/projected/579d7285-3560-4795-8a38-516fd67df1f4-kube-api-access-flttz\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.117856 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8c3184f-c96d-4339-8d4c-af9b3fa9a04d-kube-api-access-96dkf" (OuterVolumeSpecName: "kube-api-access-96dkf") pod "e8c3184f-c96d-4339-8d4c-af9b3fa9a04d" (UID: "e8c3184f-c96d-4339-8d4c-af9b3fa9a04d"). InnerVolumeSpecName "kube-api-access-96dkf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.130813 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Nov 24 09:17:49 crc kubenswrapper[4563]: E1124 09:17:49.131202 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6005814-b899-4e79-816e-c51ffbe41a91" containerName="mariadb-account-create" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.131223 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6005814-b899-4e79-816e-c51ffbe41a91" containerName="mariadb-account-create" Nov 24 09:17:49 crc kubenswrapper[4563]: E1124 09:17:49.131238 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6072ef52-6143-434b-b56f-06e3d11c966f" containerName="mariadb-account-create" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.131246 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="6072ef52-6143-434b-b56f-06e3d11c966f" containerName="mariadb-account-create" Nov 24 09:17:49 crc kubenswrapper[4563]: E1124 09:17:49.131255 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a" containerName="init" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.131268 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a" containerName="init" Nov 24 09:17:49 crc kubenswrapper[4563]: E1124 09:17:49.131281 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a" containerName="dnsmasq-dns" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.131287 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a" containerName="dnsmasq-dns" Nov 24 09:17:49 crc kubenswrapper[4563]: E1124 09:17:49.131304 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="579d7285-3560-4795-8a38-516fd67df1f4" 
containerName="mariadb-database-create" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.131310 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="579d7285-3560-4795-8a38-516fd67df1f4" containerName="mariadb-database-create" Nov 24 09:17:49 crc kubenswrapper[4563]: E1124 09:17:49.131318 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8c3184f-c96d-4339-8d4c-af9b3fa9a04d" containerName="mariadb-database-create" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.131324 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8c3184f-c96d-4339-8d4c-af9b3fa9a04d" containerName="mariadb-database-create" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.131586 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="6072ef52-6143-434b-b56f-06e3d11c966f" containerName="mariadb-account-create" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.131603 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dafe0c2-efd2-40ea-8dc1-c6cfbbd84e6a" containerName="dnsmasq-dns" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.131612 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8c3184f-c96d-4339-8d4c-af9b3fa9a04d" containerName="mariadb-database-create" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.131664 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6005814-b899-4e79-816e-c51ffbe41a91" containerName="mariadb-account-create" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.131675 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="579d7285-3560-4795-8a38-516fd67df1f4" containerName="mariadb-database-create" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.138831 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.143253 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.143277 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.143288 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.144447 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.145017 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-qf92x" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.218051 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96dkf\" (UniqueName: \"kubernetes.io/projected/e8c3184f-c96d-4339-8d4c-af9b3fa9a04d-kube-api-access-96dkf\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.319594 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9d44fee0-139c-42c9-8ad1-3991121f1d67-lock\") pod \"swift-storage-0\" (UID: \"9d44fee0-139c-42c9-8ad1-3991121f1d67\") " pod="openstack/swift-storage-0" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.319705 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9d44fee0-139c-42c9-8ad1-3991121f1d67-cache\") pod \"swift-storage-0\" (UID: \"9d44fee0-139c-42c9-8ad1-3991121f1d67\") " pod="openstack/swift-storage-0" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.319742 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"9d44fee0-139c-42c9-8ad1-3991121f1d67\") " pod="openstack/swift-storage-0" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.319797 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbpxp\" (UniqueName: \"kubernetes.io/projected/9d44fee0-139c-42c9-8ad1-3991121f1d67-kube-api-access-tbpxp\") pod \"swift-storage-0\" (UID: \"9d44fee0-139c-42c9-8ad1-3991121f1d67\") " pod="openstack/swift-storage-0" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.319861 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9d44fee0-139c-42c9-8ad1-3991121f1d67-etc-swift\") pod \"swift-storage-0\" (UID: \"9d44fee0-139c-42c9-8ad1-3991121f1d67\") " pod="openstack/swift-storage-0" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.422598 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9d44fee0-139c-42c9-8ad1-3991121f1d67-lock\") pod \"swift-storage-0\" (UID: \"9d44fee0-139c-42c9-8ad1-3991121f1d67\") " pod="openstack/swift-storage-0" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.422748 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9d44fee0-139c-42c9-8ad1-3991121f1d67-cache\") pod \"swift-storage-0\" (UID: \"9d44fee0-139c-42c9-8ad1-3991121f1d67\") " pod="openstack/swift-storage-0" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.422794 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: 
\"9d44fee0-139c-42c9-8ad1-3991121f1d67\") " pod="openstack/swift-storage-0" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.422875 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbpxp\" (UniqueName: \"kubernetes.io/projected/9d44fee0-139c-42c9-8ad1-3991121f1d67-kube-api-access-tbpxp\") pod \"swift-storage-0\" (UID: \"9d44fee0-139c-42c9-8ad1-3991121f1d67\") " pod="openstack/swift-storage-0" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.422981 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9d44fee0-139c-42c9-8ad1-3991121f1d67-etc-swift\") pod \"swift-storage-0\" (UID: \"9d44fee0-139c-42c9-8ad1-3991121f1d67\") " pod="openstack/swift-storage-0" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.423182 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9d44fee0-139c-42c9-8ad1-3991121f1d67-lock\") pod \"swift-storage-0\" (UID: \"9d44fee0-139c-42c9-8ad1-3991121f1d67\") " pod="openstack/swift-storage-0" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.423279 4563 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"9d44fee0-139c-42c9-8ad1-3991121f1d67\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/swift-storage-0" Nov 24 09:17:49 crc kubenswrapper[4563]: E1124 09:17:49.423358 4563 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 24 09:17:49 crc kubenswrapper[4563]: E1124 09:17:49.423388 4563 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 24 09:17:49 crc kubenswrapper[4563]: E1124 09:17:49.423463 4563 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d44fee0-139c-42c9-8ad1-3991121f1d67-etc-swift podName:9d44fee0-139c-42c9-8ad1-3991121f1d67 nodeName:}" failed. No retries permitted until 2025-11-24 09:17:49.923436689 +0000 UTC m=+847.182414136 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9d44fee0-139c-42c9-8ad1-3991121f1d67-etc-swift") pod "swift-storage-0" (UID: "9d44fee0-139c-42c9-8ad1-3991121f1d67") : configmap "swift-ring-files" not found Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.423282 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9d44fee0-139c-42c9-8ad1-3991121f1d67-cache\") pod \"swift-storage-0\" (UID: \"9d44fee0-139c-42c9-8ad1-3991121f1d67\") " pod="openstack/swift-storage-0" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.440020 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbpxp\" (UniqueName: \"kubernetes.io/projected/9d44fee0-139c-42c9-8ad1-3991121f1d67-kube-api-access-tbpxp\") pod \"swift-storage-0\" (UID: \"9d44fee0-139c-42c9-8ad1-3991121f1d67\") " pod="openstack/swift-storage-0" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.440977 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-2c9nb" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.441992 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2c9nb" event={"ID":"579d7285-3560-4795-8a38-516fd67df1f4","Type":"ContainerDied","Data":"d034034165a7386ce92703a928c08396373b518a22f5900b9db2b49f6d2ce96e"} Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.442023 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d034034165a7386ce92703a928c08396373b518a22f5900b9db2b49f6d2ce96e" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.442815 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"9d44fee0-139c-42c9-8ad1-3991121f1d67\") " pod="openstack/swift-storage-0" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.444105 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2snzz" event={"ID":"e8c3184f-c96d-4339-8d4c-af9b3fa9a04d","Type":"ContainerDied","Data":"f4586fb510c6d9fa4f31256eaed290f599a8eb4d7090e8a4c6a20d08a5121e81"} Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.444248 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4586fb510c6d9fa4f31256eaed290f599a8eb4d7090e8a4c6a20d08a5121e81" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.444393 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-2snzz" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.445675 4563 generic.go:334] "Generic (PLEG): container finished" podID="46649dc4-4337-4378-a0a1-70b329141a22" containerID="e3ecef361d60e4158ee904d8c75861ca04dd27652ec80cabafd5896cf0859914" exitCode=0 Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.445857 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk" event={"ID":"46649dc4-4337-4378-a0a1-70b329141a22","Type":"ContainerDied","Data":"e3ecef361d60e4158ee904d8c75861ca04dd27652ec80cabafd5896cf0859914"} Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.445909 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk" event={"ID":"46649dc4-4337-4378-a0a1-70b329141a22","Type":"ContainerStarted","Data":"96e2d1582694ad6ebdf3eb01743977005afb8a7a84eeea0d1bac2fdbc1d90368"} Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.448715 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2820-account-create-rzsg6" event={"ID":"6072ef52-6143-434b-b56f-06e3d11c966f","Type":"ContainerDied","Data":"e8b80f57bce9e05d11678b69f3350bead58ce88a364921af87df01d9672e3d70"} Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.448762 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8b80f57bce9e05d11678b69f3350bead58ce88a364921af87df01d9672e3d70" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.448763 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-2820-account-create-rzsg6" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.451286 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-aa19-account-create-bp4rh" event={"ID":"b6005814-b899-4e79-816e-c51ffbe41a91","Type":"ContainerDied","Data":"4d1fb504fa520828ca839e2a6b0e6df26c05591ed65a1e2c06b764c72b40c468"} Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.451381 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d1fb504fa520828ca839e2a6b0e6df26c05591ed65a1e2c06b764c72b40c468" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.451500 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-aa19-account-create-bp4rh" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.662679 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-mgbkw"] Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.664076 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mgbkw" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.665744 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.666988 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.667320 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.671459 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mgbkw"] Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.831138 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/874b4a65-f3cc-4bb7-9634-0a464700f823-combined-ca-bundle\") pod \"swift-ring-rebalance-mgbkw\" (UID: \"874b4a65-f3cc-4bb7-9634-0a464700f823\") " pod="openstack/swift-ring-rebalance-mgbkw" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.831203 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc4k7\" (UniqueName: \"kubernetes.io/projected/874b4a65-f3cc-4bb7-9634-0a464700f823-kube-api-access-lc4k7\") pod \"swift-ring-rebalance-mgbkw\" (UID: \"874b4a65-f3cc-4bb7-9634-0a464700f823\") " pod="openstack/swift-ring-rebalance-mgbkw" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.831336 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/874b4a65-f3cc-4bb7-9634-0a464700f823-ring-data-devices\") pod \"swift-ring-rebalance-mgbkw\" (UID: \"874b4a65-f3cc-4bb7-9634-0a464700f823\") " pod="openstack/swift-ring-rebalance-mgbkw" Nov 24 09:17:49 crc 
kubenswrapper[4563]: I1124 09:17:49.831393 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/874b4a65-f3cc-4bb7-9634-0a464700f823-swiftconf\") pod \"swift-ring-rebalance-mgbkw\" (UID: \"874b4a65-f3cc-4bb7-9634-0a464700f823\") " pod="openstack/swift-ring-rebalance-mgbkw" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.831718 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/874b4a65-f3cc-4bb7-9634-0a464700f823-etc-swift\") pod \"swift-ring-rebalance-mgbkw\" (UID: \"874b4a65-f3cc-4bb7-9634-0a464700f823\") " pod="openstack/swift-ring-rebalance-mgbkw" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.831899 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/874b4a65-f3cc-4bb7-9634-0a464700f823-dispersionconf\") pod \"swift-ring-rebalance-mgbkw\" (UID: \"874b4a65-f3cc-4bb7-9634-0a464700f823\") " pod="openstack/swift-ring-rebalance-mgbkw" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.832973 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/874b4a65-f3cc-4bb7-9634-0a464700f823-scripts\") pod \"swift-ring-rebalance-mgbkw\" (UID: \"874b4a65-f3cc-4bb7-9634-0a464700f823\") " pod="openstack/swift-ring-rebalance-mgbkw" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.935020 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/874b4a65-f3cc-4bb7-9634-0a464700f823-etc-swift\") pod \"swift-ring-rebalance-mgbkw\" (UID: \"874b4a65-f3cc-4bb7-9634-0a464700f823\") " pod="openstack/swift-ring-rebalance-mgbkw" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.935072 4563 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9d44fee0-139c-42c9-8ad1-3991121f1d67-etc-swift\") pod \"swift-storage-0\" (UID: \"9d44fee0-139c-42c9-8ad1-3991121f1d67\") " pod="openstack/swift-storage-0" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.935094 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/874b4a65-f3cc-4bb7-9634-0a464700f823-dispersionconf\") pod \"swift-ring-rebalance-mgbkw\" (UID: \"874b4a65-f3cc-4bb7-9634-0a464700f823\") " pod="openstack/swift-ring-rebalance-mgbkw" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.935142 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/874b4a65-f3cc-4bb7-9634-0a464700f823-scripts\") pod \"swift-ring-rebalance-mgbkw\" (UID: \"874b4a65-f3cc-4bb7-9634-0a464700f823\") " pod="openstack/swift-ring-rebalance-mgbkw" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.935175 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/874b4a65-f3cc-4bb7-9634-0a464700f823-combined-ca-bundle\") pod \"swift-ring-rebalance-mgbkw\" (UID: \"874b4a65-f3cc-4bb7-9634-0a464700f823\") " pod="openstack/swift-ring-rebalance-mgbkw" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.935190 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc4k7\" (UniqueName: \"kubernetes.io/projected/874b4a65-f3cc-4bb7-9634-0a464700f823-kube-api-access-lc4k7\") pod \"swift-ring-rebalance-mgbkw\" (UID: \"874b4a65-f3cc-4bb7-9634-0a464700f823\") " pod="openstack/swift-ring-rebalance-mgbkw" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.935225 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" 
(UniqueName: \"kubernetes.io/configmap/874b4a65-f3cc-4bb7-9634-0a464700f823-ring-data-devices\") pod \"swift-ring-rebalance-mgbkw\" (UID: \"874b4a65-f3cc-4bb7-9634-0a464700f823\") " pod="openstack/swift-ring-rebalance-mgbkw" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.935252 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/874b4a65-f3cc-4bb7-9634-0a464700f823-swiftconf\") pod \"swift-ring-rebalance-mgbkw\" (UID: \"874b4a65-f3cc-4bb7-9634-0a464700f823\") " pod="openstack/swift-ring-rebalance-mgbkw" Nov 24 09:17:49 crc kubenswrapper[4563]: E1124 09:17:49.935291 4563 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 24 09:17:49 crc kubenswrapper[4563]: E1124 09:17:49.935314 4563 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 24 09:17:49 crc kubenswrapper[4563]: E1124 09:17:49.935366 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d44fee0-139c-42c9-8ad1-3991121f1d67-etc-swift podName:9d44fee0-139c-42c9-8ad1-3991121f1d67 nodeName:}" failed. No retries permitted until 2025-11-24 09:17:50.935351815 +0000 UTC m=+848.194329263 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9d44fee0-139c-42c9-8ad1-3991121f1d67-etc-swift") pod "swift-storage-0" (UID: "9d44fee0-139c-42c9-8ad1-3991121f1d67") : configmap "swift-ring-files" not found Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.935559 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/874b4a65-f3cc-4bb7-9634-0a464700f823-etc-swift\") pod \"swift-ring-rebalance-mgbkw\" (UID: \"874b4a65-f3cc-4bb7-9634-0a464700f823\") " pod="openstack/swift-ring-rebalance-mgbkw" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.936110 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/874b4a65-f3cc-4bb7-9634-0a464700f823-ring-data-devices\") pod \"swift-ring-rebalance-mgbkw\" (UID: \"874b4a65-f3cc-4bb7-9634-0a464700f823\") " pod="openstack/swift-ring-rebalance-mgbkw" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.936507 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/874b4a65-f3cc-4bb7-9634-0a464700f823-scripts\") pod \"swift-ring-rebalance-mgbkw\" (UID: \"874b4a65-f3cc-4bb7-9634-0a464700f823\") " pod="openstack/swift-ring-rebalance-mgbkw" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.940441 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/874b4a65-f3cc-4bb7-9634-0a464700f823-combined-ca-bundle\") pod \"swift-ring-rebalance-mgbkw\" (UID: \"874b4a65-f3cc-4bb7-9634-0a464700f823\") " pod="openstack/swift-ring-rebalance-mgbkw" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.940497 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/874b4a65-f3cc-4bb7-9634-0a464700f823-swiftconf\") pod 
\"swift-ring-rebalance-mgbkw\" (UID: \"874b4a65-f3cc-4bb7-9634-0a464700f823\") " pod="openstack/swift-ring-rebalance-mgbkw" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.940728 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/874b4a65-f3cc-4bb7-9634-0a464700f823-dispersionconf\") pod \"swift-ring-rebalance-mgbkw\" (UID: \"874b4a65-f3cc-4bb7-9634-0a464700f823\") " pod="openstack/swift-ring-rebalance-mgbkw" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.950609 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc4k7\" (UniqueName: \"kubernetes.io/projected/874b4a65-f3cc-4bb7-9634-0a464700f823-kube-api-access-lc4k7\") pod \"swift-ring-rebalance-mgbkw\" (UID: \"874b4a65-f3cc-4bb7-9634-0a464700f823\") " pod="openstack/swift-ring-rebalance-mgbkw" Nov 24 09:17:49 crc kubenswrapper[4563]: I1124 09:17:49.976555 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mgbkw" Nov 24 09:17:50 crc kubenswrapper[4563]: I1124 09:17:50.353654 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mgbkw"] Nov 24 09:17:50 crc kubenswrapper[4563]: W1124 09:17:50.356875 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod874b4a65_f3cc_4bb7_9634_0a464700f823.slice/crio-04caea63bb43b5ce5bcebfe52c455055248cbe8c7b067dac0550407115539028 WatchSource:0}: Error finding container 04caea63bb43b5ce5bcebfe52c455055248cbe8c7b067dac0550407115539028: Status 404 returned error can't find the container with id 04caea63bb43b5ce5bcebfe52c455055248cbe8c7b067dac0550407115539028 Nov 24 09:17:50 crc kubenswrapper[4563]: I1124 09:17:50.465822 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk" event={"ID":"46649dc4-4337-4378-a0a1-70b329141a22","Type":"ContainerStarted","Data":"e7636c149a4b5bd1098aa169a69ea8d47f767d6a95e9db54cf48fdd3ebc2b10b"} Nov 24 09:17:50 crc kubenswrapper[4563]: I1124 09:17:50.466140 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk" Nov 24 09:17:50 crc kubenswrapper[4563]: I1124 09:17:50.467350 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mgbkw" event={"ID":"874b4a65-f3cc-4bb7-9634-0a464700f823","Type":"ContainerStarted","Data":"04caea63bb43b5ce5bcebfe52c455055248cbe8c7b067dac0550407115539028"} Nov 24 09:17:50 crc kubenswrapper[4563]: I1124 09:17:50.485513 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk" podStartSLOduration=3.48549459 podStartE2EDuration="3.48549459s" podCreationTimestamp="2025-11-24 09:17:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-24 09:17:50.481166956 +0000 UTC m=+847.740144413" watchObservedRunningTime="2025-11-24 09:17:50.48549459 +0000 UTC m=+847.744472036" Nov 24 09:17:50 crc kubenswrapper[4563]: I1124 09:17:50.960173 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9d44fee0-139c-42c9-8ad1-3991121f1d67-etc-swift\") pod \"swift-storage-0\" (UID: \"9d44fee0-139c-42c9-8ad1-3991121f1d67\") " pod="openstack/swift-storage-0" Nov 24 09:17:50 crc kubenswrapper[4563]: E1124 09:17:50.960688 4563 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 24 09:17:50 crc kubenswrapper[4563]: E1124 09:17:50.960835 4563 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 24 09:17:50 crc kubenswrapper[4563]: E1124 09:17:50.960908 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d44fee0-139c-42c9-8ad1-3991121f1d67-etc-swift podName:9d44fee0-139c-42c9-8ad1-3991121f1d67 nodeName:}" failed. No retries permitted until 2025-11-24 09:17:52.960885932 +0000 UTC m=+850.219863379 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9d44fee0-139c-42c9-8ad1-3991121f1d67-etc-swift") pod "swift-storage-0" (UID: "9d44fee0-139c-42c9-8ad1-3991121f1d67") : configmap "swift-ring-files" not found Nov 24 09:17:51 crc kubenswrapper[4563]: I1124 09:17:51.434332 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-j4vjc"] Nov 24 09:17:51 crc kubenswrapper[4563]: I1124 09:17:51.435534 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-j4vjc" Nov 24 09:17:51 crc kubenswrapper[4563]: I1124 09:17:51.451744 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-j4vjc"] Nov 24 09:17:51 crc kubenswrapper[4563]: I1124 09:17:51.458489 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-9dd3-account-create-wwmtc"] Nov 24 09:17:51 crc kubenswrapper[4563]: I1124 09:17:51.459484 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9dd3-account-create-wwmtc" Nov 24 09:17:51 crc kubenswrapper[4563]: I1124 09:17:51.461052 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Nov 24 09:17:51 crc kubenswrapper[4563]: I1124 09:17:51.468938 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4df8208-ad24-49df-bd0b-dfb181a9269e-operator-scripts\") pod \"glance-9dd3-account-create-wwmtc\" (UID: \"f4df8208-ad24-49df-bd0b-dfb181a9269e\") " pod="openstack/glance-9dd3-account-create-wwmtc" Nov 24 09:17:51 crc kubenswrapper[4563]: I1124 09:17:51.469008 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcp94\" (UniqueName: \"kubernetes.io/projected/411e07f4-60ab-4c24-835e-8c677e121702-kube-api-access-qcp94\") pod \"glance-db-create-j4vjc\" (UID: \"411e07f4-60ab-4c24-835e-8c677e121702\") " pod="openstack/glance-db-create-j4vjc" Nov 24 09:17:51 crc kubenswrapper[4563]: I1124 09:17:51.469054 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7j6n\" (UniqueName: \"kubernetes.io/projected/f4df8208-ad24-49df-bd0b-dfb181a9269e-kube-api-access-l7j6n\") pod \"glance-9dd3-account-create-wwmtc\" (UID: \"f4df8208-ad24-49df-bd0b-dfb181a9269e\") " pod="openstack/glance-9dd3-account-create-wwmtc" Nov 
24 09:17:51 crc kubenswrapper[4563]: I1124 09:17:51.469215 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/411e07f4-60ab-4c24-835e-8c677e121702-operator-scripts\") pod \"glance-db-create-j4vjc\" (UID: \"411e07f4-60ab-4c24-835e-8c677e121702\") " pod="openstack/glance-db-create-j4vjc" Nov 24 09:17:51 crc kubenswrapper[4563]: I1124 09:17:51.479511 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9dd3-account-create-wwmtc"] Nov 24 09:17:51 crc kubenswrapper[4563]: I1124 09:17:51.571089 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/411e07f4-60ab-4c24-835e-8c677e121702-operator-scripts\") pod \"glance-db-create-j4vjc\" (UID: \"411e07f4-60ab-4c24-835e-8c677e121702\") " pod="openstack/glance-db-create-j4vjc" Nov 24 09:17:51 crc kubenswrapper[4563]: I1124 09:17:51.571178 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4df8208-ad24-49df-bd0b-dfb181a9269e-operator-scripts\") pod \"glance-9dd3-account-create-wwmtc\" (UID: \"f4df8208-ad24-49df-bd0b-dfb181a9269e\") " pod="openstack/glance-9dd3-account-create-wwmtc" Nov 24 09:17:51 crc kubenswrapper[4563]: I1124 09:17:51.571230 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcp94\" (UniqueName: \"kubernetes.io/projected/411e07f4-60ab-4c24-835e-8c677e121702-kube-api-access-qcp94\") pod \"glance-db-create-j4vjc\" (UID: \"411e07f4-60ab-4c24-835e-8c677e121702\") " pod="openstack/glance-db-create-j4vjc" Nov 24 09:17:51 crc kubenswrapper[4563]: I1124 09:17:51.571300 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7j6n\" (UniqueName: 
\"kubernetes.io/projected/f4df8208-ad24-49df-bd0b-dfb181a9269e-kube-api-access-l7j6n\") pod \"glance-9dd3-account-create-wwmtc\" (UID: \"f4df8208-ad24-49df-bd0b-dfb181a9269e\") " pod="openstack/glance-9dd3-account-create-wwmtc" Nov 24 09:17:51 crc kubenswrapper[4563]: I1124 09:17:51.572440 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/411e07f4-60ab-4c24-835e-8c677e121702-operator-scripts\") pod \"glance-db-create-j4vjc\" (UID: \"411e07f4-60ab-4c24-835e-8c677e121702\") " pod="openstack/glance-db-create-j4vjc" Nov 24 09:17:51 crc kubenswrapper[4563]: I1124 09:17:51.573806 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4df8208-ad24-49df-bd0b-dfb181a9269e-operator-scripts\") pod \"glance-9dd3-account-create-wwmtc\" (UID: \"f4df8208-ad24-49df-bd0b-dfb181a9269e\") " pod="openstack/glance-9dd3-account-create-wwmtc" Nov 24 09:17:51 crc kubenswrapper[4563]: I1124 09:17:51.589869 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcp94\" (UniqueName: \"kubernetes.io/projected/411e07f4-60ab-4c24-835e-8c677e121702-kube-api-access-qcp94\") pod \"glance-db-create-j4vjc\" (UID: \"411e07f4-60ab-4c24-835e-8c677e121702\") " pod="openstack/glance-db-create-j4vjc" Nov 24 09:17:51 crc kubenswrapper[4563]: I1124 09:17:51.600792 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7j6n\" (UniqueName: \"kubernetes.io/projected/f4df8208-ad24-49df-bd0b-dfb181a9269e-kube-api-access-l7j6n\") pod \"glance-9dd3-account-create-wwmtc\" (UID: \"f4df8208-ad24-49df-bd0b-dfb181a9269e\") " pod="openstack/glance-9dd3-account-create-wwmtc" Nov 24 09:17:51 crc kubenswrapper[4563]: I1124 09:17:51.754153 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-j4vjc" Nov 24 09:17:51 crc kubenswrapper[4563]: I1124 09:17:51.774648 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9dd3-account-create-wwmtc" Nov 24 09:17:51 crc kubenswrapper[4563]: I1124 09:17:51.901825 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-65c9b8d4f7-7gcks" Nov 24 09:17:52 crc kubenswrapper[4563]: I1124 09:17:52.994043 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9d44fee0-139c-42c9-8ad1-3991121f1d67-etc-swift\") pod \"swift-storage-0\" (UID: \"9d44fee0-139c-42c9-8ad1-3991121f1d67\") " pod="openstack/swift-storage-0" Nov 24 09:17:52 crc kubenswrapper[4563]: E1124 09:17:52.994237 4563 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 24 09:17:52 crc kubenswrapper[4563]: E1124 09:17:52.994290 4563 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 24 09:17:52 crc kubenswrapper[4563]: E1124 09:17:52.994345 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d44fee0-139c-42c9-8ad1-3991121f1d67-etc-swift podName:9d44fee0-139c-42c9-8ad1-3991121f1d67 nodeName:}" failed. No retries permitted until 2025-11-24 09:17:56.99432925 +0000 UTC m=+854.253306698 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9d44fee0-139c-42c9-8ad1-3991121f1d67-etc-swift") pod "swift-storage-0" (UID: "9d44fee0-139c-42c9-8ad1-3991121f1d67") : configmap "swift-ring-files" not found Nov 24 09:17:53 crc kubenswrapper[4563]: I1124 09:17:53.427278 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9dd3-account-create-wwmtc"] Nov 24 09:17:53 crc kubenswrapper[4563]: I1124 09:17:53.485697 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-j4vjc"] Nov 24 09:17:53 crc kubenswrapper[4563]: W1124 09:17:53.486047 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod411e07f4_60ab_4c24_835e_8c677e121702.slice/crio-70e0cdd9edbc9f871aa89dc97b598f86985825c3b14a6a3bb3b99e58501fa69d WatchSource:0}: Error finding container 70e0cdd9edbc9f871aa89dc97b598f86985825c3b14a6a3bb3b99e58501fa69d: Status 404 returned error can't find the container with id 70e0cdd9edbc9f871aa89dc97b598f86985825c3b14a6a3bb3b99e58501fa69d Nov 24 09:17:53 crc kubenswrapper[4563]: I1124 09:17:53.498092 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mgbkw" event={"ID":"874b4a65-f3cc-4bb7-9634-0a464700f823","Type":"ContainerStarted","Data":"c244f59ee8e692f77620cd1548ee3164048f5aa54ad92a9312d80dce95f6e9c2"} Nov 24 09:17:53 crc kubenswrapper[4563]: I1124 09:17:53.499381 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-j4vjc" event={"ID":"411e07f4-60ab-4c24-835e-8c677e121702","Type":"ContainerStarted","Data":"70e0cdd9edbc9f871aa89dc97b598f86985825c3b14a6a3bb3b99e58501fa69d"} Nov 24 09:17:53 crc kubenswrapper[4563]: I1124 09:17:53.500511 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9dd3-account-create-wwmtc" 
event={"ID":"f4df8208-ad24-49df-bd0b-dfb181a9269e","Type":"ContainerStarted","Data":"0b9d3cbe920e8f5dffdefbd4828a37f9bcb75ef4f3dc54a1a44e4786c76d3ffa"} Nov 24 09:17:53 crc kubenswrapper[4563]: I1124 09:17:53.520425 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-mgbkw" podStartSLOduration=1.832125355 podStartE2EDuration="4.520410798s" podCreationTimestamp="2025-11-24 09:17:49 +0000 UTC" firstStartedPulling="2025-11-24 09:17:50.358899217 +0000 UTC m=+847.617876664" lastFinishedPulling="2025-11-24 09:17:53.04718466 +0000 UTC m=+850.306162107" observedRunningTime="2025-11-24 09:17:53.514399319 +0000 UTC m=+850.773376766" watchObservedRunningTime="2025-11-24 09:17:53.520410798 +0000 UTC m=+850.779388245" Nov 24 09:17:54 crc kubenswrapper[4563]: I1124 09:17:54.510065 4563 generic.go:334] "Generic (PLEG): container finished" podID="411e07f4-60ab-4c24-835e-8c677e121702" containerID="d5da6cab5264481835c5f9fa0d133666f92336aa7a6601d942d575db8d3bd9d4" exitCode=0 Nov 24 09:17:54 crc kubenswrapper[4563]: I1124 09:17:54.510408 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-j4vjc" event={"ID":"411e07f4-60ab-4c24-835e-8c677e121702","Type":"ContainerDied","Data":"d5da6cab5264481835c5f9fa0d133666f92336aa7a6601d942d575db8d3bd9d4"} Nov 24 09:17:54 crc kubenswrapper[4563]: I1124 09:17:54.512499 4563 generic.go:334] "Generic (PLEG): container finished" podID="f4df8208-ad24-49df-bd0b-dfb181a9269e" containerID="c8e054827b3388db0230a41a37b488fbd7caa049e0c9f4174587f47878f6e3a4" exitCode=0 Nov 24 09:17:54 crc kubenswrapper[4563]: I1124 09:17:54.512794 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9dd3-account-create-wwmtc" event={"ID":"f4df8208-ad24-49df-bd0b-dfb181a9269e","Type":"ContainerDied","Data":"c8e054827b3388db0230a41a37b488fbd7caa049e0c9f4174587f47878f6e3a4"} Nov 24 09:17:55 crc kubenswrapper[4563]: I1124 09:17:55.834147 4563 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-j4vjc" Nov 24 09:17:55 crc kubenswrapper[4563]: I1124 09:17:55.838840 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9dd3-account-create-wwmtc" Nov 24 09:17:55 crc kubenswrapper[4563]: I1124 09:17:55.860141 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7j6n\" (UniqueName: \"kubernetes.io/projected/f4df8208-ad24-49df-bd0b-dfb181a9269e-kube-api-access-l7j6n\") pod \"f4df8208-ad24-49df-bd0b-dfb181a9269e\" (UID: \"f4df8208-ad24-49df-bd0b-dfb181a9269e\") " Nov 24 09:17:55 crc kubenswrapper[4563]: I1124 09:17:55.860439 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4df8208-ad24-49df-bd0b-dfb181a9269e-operator-scripts\") pod \"f4df8208-ad24-49df-bd0b-dfb181a9269e\" (UID: \"f4df8208-ad24-49df-bd0b-dfb181a9269e\") " Nov 24 09:17:55 crc kubenswrapper[4563]: I1124 09:17:55.860527 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/411e07f4-60ab-4c24-835e-8c677e121702-operator-scripts\") pod \"411e07f4-60ab-4c24-835e-8c677e121702\" (UID: \"411e07f4-60ab-4c24-835e-8c677e121702\") " Nov 24 09:17:55 crc kubenswrapper[4563]: I1124 09:17:55.861394 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcp94\" (UniqueName: \"kubernetes.io/projected/411e07f4-60ab-4c24-835e-8c677e121702-kube-api-access-qcp94\") pod \"411e07f4-60ab-4c24-835e-8c677e121702\" (UID: \"411e07f4-60ab-4c24-835e-8c677e121702\") " Nov 24 09:17:55 crc kubenswrapper[4563]: I1124 09:17:55.861320 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4df8208-ad24-49df-bd0b-dfb181a9269e-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "f4df8208-ad24-49df-bd0b-dfb181a9269e" (UID: "f4df8208-ad24-49df-bd0b-dfb181a9269e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:17:55 crc kubenswrapper[4563]: I1124 09:17:55.861368 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/411e07f4-60ab-4c24-835e-8c677e121702-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "411e07f4-60ab-4c24-835e-8c677e121702" (UID: "411e07f4-60ab-4c24-835e-8c677e121702"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:17:55 crc kubenswrapper[4563]: I1124 09:17:55.862083 4563 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4df8208-ad24-49df-bd0b-dfb181a9269e-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:55 crc kubenswrapper[4563]: I1124 09:17:55.862160 4563 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/411e07f4-60ab-4c24-835e-8c677e121702-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:55 crc kubenswrapper[4563]: I1124 09:17:55.865755 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4df8208-ad24-49df-bd0b-dfb181a9269e-kube-api-access-l7j6n" (OuterVolumeSpecName: "kube-api-access-l7j6n") pod "f4df8208-ad24-49df-bd0b-dfb181a9269e" (UID: "f4df8208-ad24-49df-bd0b-dfb181a9269e"). InnerVolumeSpecName "kube-api-access-l7j6n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:17:55 crc kubenswrapper[4563]: I1124 09:17:55.865798 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/411e07f4-60ab-4c24-835e-8c677e121702-kube-api-access-qcp94" (OuterVolumeSpecName: "kube-api-access-qcp94") pod "411e07f4-60ab-4c24-835e-8c677e121702" (UID: "411e07f4-60ab-4c24-835e-8c677e121702"). InnerVolumeSpecName "kube-api-access-qcp94". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:17:55 crc kubenswrapper[4563]: I1124 09:17:55.964094 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcp94\" (UniqueName: \"kubernetes.io/projected/411e07f4-60ab-4c24-835e-8c677e121702-kube-api-access-qcp94\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:55 crc kubenswrapper[4563]: I1124 09:17:55.964132 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7j6n\" (UniqueName: \"kubernetes.io/projected/f4df8208-ad24-49df-bd0b-dfb181a9269e-kube-api-access-l7j6n\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:56 crc kubenswrapper[4563]: I1124 09:17:56.531024 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-j4vjc" event={"ID":"411e07f4-60ab-4c24-835e-8c677e121702","Type":"ContainerDied","Data":"70e0cdd9edbc9f871aa89dc97b598f86985825c3b14a6a3bb3b99e58501fa69d"} Nov 24 09:17:56 crc kubenswrapper[4563]: I1124 09:17:56.531385 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70e0cdd9edbc9f871aa89dc97b598f86985825c3b14a6a3bb3b99e58501fa69d" Nov 24 09:17:56 crc kubenswrapper[4563]: I1124 09:17:56.531085 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-j4vjc" Nov 24 09:17:56 crc kubenswrapper[4563]: I1124 09:17:56.542450 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9dd3-account-create-wwmtc" event={"ID":"f4df8208-ad24-49df-bd0b-dfb181a9269e","Type":"ContainerDied","Data":"0b9d3cbe920e8f5dffdefbd4828a37f9bcb75ef4f3dc54a1a44e4786c76d3ffa"} Nov 24 09:17:56 crc kubenswrapper[4563]: I1124 09:17:56.542502 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b9d3cbe920e8f5dffdefbd4828a37f9bcb75ef4f3dc54a1a44e4786c76d3ffa" Nov 24 09:17:56 crc kubenswrapper[4563]: I1124 09:17:56.542570 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9dd3-account-create-wwmtc" Nov 24 09:17:57 crc kubenswrapper[4563]: I1124 09:17:57.082675 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9d44fee0-139c-42c9-8ad1-3991121f1d67-etc-swift\") pod \"swift-storage-0\" (UID: \"9d44fee0-139c-42c9-8ad1-3991121f1d67\") " pod="openstack/swift-storage-0" Nov 24 09:17:57 crc kubenswrapper[4563]: E1124 09:17:57.083130 4563 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 24 09:17:57 crc kubenswrapper[4563]: E1124 09:17:57.083156 4563 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 24 09:17:57 crc kubenswrapper[4563]: E1124 09:17:57.083228 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d44fee0-139c-42c9-8ad1-3991121f1d67-etc-swift podName:9d44fee0-139c-42c9-8ad1-3991121f1d67 nodeName:}" failed. No retries permitted until 2025-11-24 09:18:05.083204062 +0000 UTC m=+862.342181510 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9d44fee0-139c-42c9-8ad1-3991121f1d67-etc-swift") pod "swift-storage-0" (UID: "9d44fee0-139c-42c9-8ad1-3991121f1d67") : configmap "swift-ring-files" not found Nov 24 09:17:57 crc kubenswrapper[4563]: I1124 09:17:57.358095 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Nov 24 09:17:58 crc kubenswrapper[4563]: I1124 09:17:58.290800 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk" Nov 24 09:17:58 crc kubenswrapper[4563]: I1124 09:17:58.371980 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65c9b8d4f7-7gcks"] Nov 24 09:17:58 crc kubenswrapper[4563]: I1124 09:17:58.372241 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-65c9b8d4f7-7gcks" podUID="af3bd664-81f7-40fa-a654-deaefa63e7e0" containerName="dnsmasq-dns" containerID="cri-o://b6ab5a048f3cb70c5fe7d3e8b44e2006f60e00792e629c8fea25386835a6c91a" gracePeriod=10 Nov 24 09:17:58 crc kubenswrapper[4563]: I1124 09:17:58.558653 4563 generic.go:334] "Generic (PLEG): container finished" podID="af3bd664-81f7-40fa-a654-deaefa63e7e0" containerID="b6ab5a048f3cb70c5fe7d3e8b44e2006f60e00792e629c8fea25386835a6c91a" exitCode=0 Nov 24 09:17:58 crc kubenswrapper[4563]: I1124 09:17:58.558696 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c9b8d4f7-7gcks" event={"ID":"af3bd664-81f7-40fa-a654-deaefa63e7e0","Type":"ContainerDied","Data":"b6ab5a048f3cb70c5fe7d3e8b44e2006f60e00792e629c8fea25386835a6c91a"} Nov 24 09:17:58 crc kubenswrapper[4563]: I1124 09:17:58.823977 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65c9b8d4f7-7gcks" Nov 24 09:17:58 crc kubenswrapper[4563]: I1124 09:17:58.916340 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af3bd664-81f7-40fa-a654-deaefa63e7e0-ovsdbserver-sb\") pod \"af3bd664-81f7-40fa-a654-deaefa63e7e0\" (UID: \"af3bd664-81f7-40fa-a654-deaefa63e7e0\") " Nov 24 09:17:58 crc kubenswrapper[4563]: I1124 09:17:58.916541 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3bd664-81f7-40fa-a654-deaefa63e7e0-config\") pod \"af3bd664-81f7-40fa-a654-deaefa63e7e0\" (UID: \"af3bd664-81f7-40fa-a654-deaefa63e7e0\") " Nov 24 09:17:58 crc kubenswrapper[4563]: I1124 09:17:58.916663 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gjkg\" (UniqueName: \"kubernetes.io/projected/af3bd664-81f7-40fa-a654-deaefa63e7e0-kube-api-access-7gjkg\") pod \"af3bd664-81f7-40fa-a654-deaefa63e7e0\" (UID: \"af3bd664-81f7-40fa-a654-deaefa63e7e0\") " Nov 24 09:17:58 crc kubenswrapper[4563]: I1124 09:17:58.916798 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af3bd664-81f7-40fa-a654-deaefa63e7e0-dns-svc\") pod \"af3bd664-81f7-40fa-a654-deaefa63e7e0\" (UID: \"af3bd664-81f7-40fa-a654-deaefa63e7e0\") " Nov 24 09:17:58 crc kubenswrapper[4563]: I1124 09:17:58.922012 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af3bd664-81f7-40fa-a654-deaefa63e7e0-kube-api-access-7gjkg" (OuterVolumeSpecName: "kube-api-access-7gjkg") pod "af3bd664-81f7-40fa-a654-deaefa63e7e0" (UID: "af3bd664-81f7-40fa-a654-deaefa63e7e0"). InnerVolumeSpecName "kube-api-access-7gjkg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:17:58 crc kubenswrapper[4563]: I1124 09:17:58.950147 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af3bd664-81f7-40fa-a654-deaefa63e7e0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "af3bd664-81f7-40fa-a654-deaefa63e7e0" (UID: "af3bd664-81f7-40fa-a654-deaefa63e7e0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:17:58 crc kubenswrapper[4563]: I1124 09:17:58.952202 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af3bd664-81f7-40fa-a654-deaefa63e7e0-config" (OuterVolumeSpecName: "config") pod "af3bd664-81f7-40fa-a654-deaefa63e7e0" (UID: "af3bd664-81f7-40fa-a654-deaefa63e7e0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:17:58 crc kubenswrapper[4563]: I1124 09:17:58.954407 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af3bd664-81f7-40fa-a654-deaefa63e7e0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "af3bd664-81f7-40fa-a654-deaefa63e7e0" (UID: "af3bd664-81f7-40fa-a654-deaefa63e7e0"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:17:59 crc kubenswrapper[4563]: I1124 09:17:59.019115 4563 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af3bd664-81f7-40fa-a654-deaefa63e7e0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:59 crc kubenswrapper[4563]: I1124 09:17:59.019155 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3bd664-81f7-40fa-a654-deaefa63e7e0-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:59 crc kubenswrapper[4563]: I1124 09:17:59.019168 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gjkg\" (UniqueName: \"kubernetes.io/projected/af3bd664-81f7-40fa-a654-deaefa63e7e0-kube-api-access-7gjkg\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:59 crc kubenswrapper[4563]: I1124 09:17:59.019180 4563 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af3bd664-81f7-40fa-a654-deaefa63e7e0-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 09:17:59 crc kubenswrapper[4563]: I1124 09:17:59.571997 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c9b8d4f7-7gcks" event={"ID":"af3bd664-81f7-40fa-a654-deaefa63e7e0","Type":"ContainerDied","Data":"860025f603a81ab7984c14231fe834ef16b35885b8df8acc1eee19169a109400"} Nov 24 09:17:59 crc kubenswrapper[4563]: I1124 09:17:59.572059 4563 scope.go:117] "RemoveContainer" containerID="b6ab5a048f3cb70c5fe7d3e8b44e2006f60e00792e629c8fea25386835a6c91a" Nov 24 09:17:59 crc kubenswrapper[4563]: I1124 09:17:59.572054 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65c9b8d4f7-7gcks" Nov 24 09:17:59 crc kubenswrapper[4563]: I1124 09:17:59.574566 4563 generic.go:334] "Generic (PLEG): container finished" podID="874b4a65-f3cc-4bb7-9634-0a464700f823" containerID="c244f59ee8e692f77620cd1548ee3164048f5aa54ad92a9312d80dce95f6e9c2" exitCode=0 Nov 24 09:17:59 crc kubenswrapper[4563]: I1124 09:17:59.574611 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mgbkw" event={"ID":"874b4a65-f3cc-4bb7-9634-0a464700f823","Type":"ContainerDied","Data":"c244f59ee8e692f77620cd1548ee3164048f5aa54ad92a9312d80dce95f6e9c2"} Nov 24 09:17:59 crc kubenswrapper[4563]: I1124 09:17:59.594264 4563 scope.go:117] "RemoveContainer" containerID="7a97bf4e7620f71590aada0c3984fcc01da8826c9eec7b69f8bbc90f0b40308e" Nov 24 09:17:59 crc kubenswrapper[4563]: I1124 09:17:59.607753 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65c9b8d4f7-7gcks"] Nov 24 09:17:59 crc kubenswrapper[4563]: I1124 09:17:59.611493 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65c9b8d4f7-7gcks"] Nov 24 09:18:00 crc kubenswrapper[4563]: I1124 09:18:00.886163 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mgbkw" Nov 24 09:18:00 crc kubenswrapper[4563]: I1124 09:18:00.946967 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/874b4a65-f3cc-4bb7-9634-0a464700f823-swiftconf\") pod \"874b4a65-f3cc-4bb7-9634-0a464700f823\" (UID: \"874b4a65-f3cc-4bb7-9634-0a464700f823\") " Nov 24 09:18:00 crc kubenswrapper[4563]: I1124 09:18:00.947056 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/874b4a65-f3cc-4bb7-9634-0a464700f823-etc-swift\") pod \"874b4a65-f3cc-4bb7-9634-0a464700f823\" (UID: \"874b4a65-f3cc-4bb7-9634-0a464700f823\") " Nov 24 09:18:00 crc kubenswrapper[4563]: I1124 09:18:00.947107 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/874b4a65-f3cc-4bb7-9634-0a464700f823-scripts\") pod \"874b4a65-f3cc-4bb7-9634-0a464700f823\" (UID: \"874b4a65-f3cc-4bb7-9634-0a464700f823\") " Nov 24 09:18:00 crc kubenswrapper[4563]: I1124 09:18:00.947131 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/874b4a65-f3cc-4bb7-9634-0a464700f823-combined-ca-bundle\") pod \"874b4a65-f3cc-4bb7-9634-0a464700f823\" (UID: \"874b4a65-f3cc-4bb7-9634-0a464700f823\") " Nov 24 09:18:00 crc kubenswrapper[4563]: I1124 09:18:00.947151 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/874b4a65-f3cc-4bb7-9634-0a464700f823-ring-data-devices\") pod \"874b4a65-f3cc-4bb7-9634-0a464700f823\" (UID: \"874b4a65-f3cc-4bb7-9634-0a464700f823\") " Nov 24 09:18:00 crc kubenswrapper[4563]: I1124 09:18:00.947194 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lc4k7\" (UniqueName: 
\"kubernetes.io/projected/874b4a65-f3cc-4bb7-9634-0a464700f823-kube-api-access-lc4k7\") pod \"874b4a65-f3cc-4bb7-9634-0a464700f823\" (UID: \"874b4a65-f3cc-4bb7-9634-0a464700f823\") " Nov 24 09:18:00 crc kubenswrapper[4563]: I1124 09:18:00.947233 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/874b4a65-f3cc-4bb7-9634-0a464700f823-dispersionconf\") pod \"874b4a65-f3cc-4bb7-9634-0a464700f823\" (UID: \"874b4a65-f3cc-4bb7-9634-0a464700f823\") " Nov 24 09:18:00 crc kubenswrapper[4563]: I1124 09:18:00.948530 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/874b4a65-f3cc-4bb7-9634-0a464700f823-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "874b4a65-f3cc-4bb7-9634-0a464700f823" (UID: "874b4a65-f3cc-4bb7-9634-0a464700f823"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:00 crc kubenswrapper[4563]: I1124 09:18:00.948619 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/874b4a65-f3cc-4bb7-9634-0a464700f823-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "874b4a65-f3cc-4bb7-9634-0a464700f823" (UID: "874b4a65-f3cc-4bb7-9634-0a464700f823"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:18:00 crc kubenswrapper[4563]: I1124 09:18:00.951430 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/874b4a65-f3cc-4bb7-9634-0a464700f823-kube-api-access-lc4k7" (OuterVolumeSpecName: "kube-api-access-lc4k7") pod "874b4a65-f3cc-4bb7-9634-0a464700f823" (UID: "874b4a65-f3cc-4bb7-9634-0a464700f823"). InnerVolumeSpecName "kube-api-access-lc4k7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:18:00 crc kubenswrapper[4563]: I1124 09:18:00.962030 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/874b4a65-f3cc-4bb7-9634-0a464700f823-scripts" (OuterVolumeSpecName: "scripts") pod "874b4a65-f3cc-4bb7-9634-0a464700f823" (UID: "874b4a65-f3cc-4bb7-9634-0a464700f823"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:00 crc kubenswrapper[4563]: I1124 09:18:00.963222 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/874b4a65-f3cc-4bb7-9634-0a464700f823-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "874b4a65-f3cc-4bb7-9634-0a464700f823" (UID: "874b4a65-f3cc-4bb7-9634-0a464700f823"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:00 crc kubenswrapper[4563]: I1124 09:18:00.965141 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/874b4a65-f3cc-4bb7-9634-0a464700f823-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "874b4a65-f3cc-4bb7-9634-0a464700f823" (UID: "874b4a65-f3cc-4bb7-9634-0a464700f823"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:00 crc kubenswrapper[4563]: I1124 09:18:00.969205 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/874b4a65-f3cc-4bb7-9634-0a464700f823-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "874b4a65-f3cc-4bb7-9634-0a464700f823" (UID: "874b4a65-f3cc-4bb7-9634-0a464700f823"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.049034 4563 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/874b4a65-f3cc-4bb7-9634-0a464700f823-swiftconf\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.049068 4563 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/874b4a65-f3cc-4bb7-9634-0a464700f823-etc-swift\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.049080 4563 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/874b4a65-f3cc-4bb7-9634-0a464700f823-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.049090 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/874b4a65-f3cc-4bb7-9634-0a464700f823-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.049103 4563 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/874b4a65-f3cc-4bb7-9634-0a464700f823-ring-data-devices\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.049111 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lc4k7\" (UniqueName: \"kubernetes.io/projected/874b4a65-f3cc-4bb7-9634-0a464700f823-kube-api-access-lc4k7\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.049121 4563 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/874b4a65-f3cc-4bb7-9634-0a464700f823-dispersionconf\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.066488 4563 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af3bd664-81f7-40fa-a654-deaefa63e7e0" path="/var/lib/kubelet/pods/af3bd664-81f7-40fa-a654-deaefa63e7e0/volumes" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.590066 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mgbkw" event={"ID":"874b4a65-f3cc-4bb7-9634-0a464700f823","Type":"ContainerDied","Data":"04caea63bb43b5ce5bcebfe52c455055248cbe8c7b067dac0550407115539028"} Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.590104 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04caea63bb43b5ce5bcebfe52c455055248cbe8c7b067dac0550407115539028" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.590154 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mgbkw" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.592234 4563 generic.go:334] "Generic (PLEG): container finished" podID="18ec698b-354c-4d4e-9126-16c493474617" containerID="92de487af76ff9936312b95bbfc2bef78c4d67ca7362338d5d1ace860caa89c0" exitCode=0 Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.592310 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"18ec698b-354c-4d4e-9126-16c493474617","Type":"ContainerDied","Data":"92de487af76ff9936312b95bbfc2bef78c4d67ca7362338d5d1ace860caa89c0"} Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.593679 4563 generic.go:334] "Generic (PLEG): container finished" podID="e4286a17-bf24-4c91-91cb-6e3f3d731d24" containerID="73d417eaecd31e6125b6cbd867865adffe9c7cd078cefc1bbdf0d4e6e9eec4e1" exitCode=0 Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.593717 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"e4286a17-bf24-4c91-91cb-6e3f3d731d24","Type":"ContainerDied","Data":"73d417eaecd31e6125b6cbd867865adffe9c7cd078cefc1bbdf0d4e6e9eec4e1"} Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.629072 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-6msdh"] Nov 24 09:18:01 crc kubenswrapper[4563]: E1124 09:18:01.629921 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af3bd664-81f7-40fa-a654-deaefa63e7e0" containerName="dnsmasq-dns" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.629942 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="af3bd664-81f7-40fa-a654-deaefa63e7e0" containerName="dnsmasq-dns" Nov 24 09:18:01 crc kubenswrapper[4563]: E1124 09:18:01.629965 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="411e07f4-60ab-4c24-835e-8c677e121702" containerName="mariadb-database-create" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.629972 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="411e07f4-60ab-4c24-835e-8c677e121702" containerName="mariadb-database-create" Nov 24 09:18:01 crc kubenswrapper[4563]: E1124 09:18:01.629986 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="874b4a65-f3cc-4bb7-9634-0a464700f823" containerName="swift-ring-rebalance" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.629991 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="874b4a65-f3cc-4bb7-9634-0a464700f823" containerName="swift-ring-rebalance" Nov 24 09:18:01 crc kubenswrapper[4563]: E1124 09:18:01.630005 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af3bd664-81f7-40fa-a654-deaefa63e7e0" containerName="init" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.630010 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="af3bd664-81f7-40fa-a654-deaefa63e7e0" containerName="init" Nov 24 09:18:01 crc kubenswrapper[4563]: E1124 09:18:01.630021 4563 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="f4df8208-ad24-49df-bd0b-dfb181a9269e" containerName="mariadb-account-create" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.630028 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4df8208-ad24-49df-bd0b-dfb181a9269e" containerName="mariadb-account-create" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.630213 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="411e07f4-60ab-4c24-835e-8c677e121702" containerName="mariadb-database-create" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.630225 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="af3bd664-81f7-40fa-a654-deaefa63e7e0" containerName="dnsmasq-dns" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.630236 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4df8208-ad24-49df-bd0b-dfb181a9269e" containerName="mariadb-account-create" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.630246 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="874b4a65-f3cc-4bb7-9634-0a464700f823" containerName="swift-ring-rebalance" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.630815 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-6msdh" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.632964 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.634113 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-c5b44" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.642721 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-6msdh"] Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.766477 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20286861-2552-4ff4-a5a1-3d67c9e0cb7b-combined-ca-bundle\") pod \"glance-db-sync-6msdh\" (UID: \"20286861-2552-4ff4-a5a1-3d67c9e0cb7b\") " pod="openstack/glance-db-sync-6msdh" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.766593 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/20286861-2552-4ff4-a5a1-3d67c9e0cb7b-db-sync-config-data\") pod \"glance-db-sync-6msdh\" (UID: \"20286861-2552-4ff4-a5a1-3d67c9e0cb7b\") " pod="openstack/glance-db-sync-6msdh" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.766816 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf6wh\" (UniqueName: \"kubernetes.io/projected/20286861-2552-4ff4-a5a1-3d67c9e0cb7b-kube-api-access-hf6wh\") pod \"glance-db-sync-6msdh\" (UID: \"20286861-2552-4ff4-a5a1-3d67c9e0cb7b\") " pod="openstack/glance-db-sync-6msdh" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.766850 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/20286861-2552-4ff4-a5a1-3d67c9e0cb7b-config-data\") pod \"glance-db-sync-6msdh\" (UID: \"20286861-2552-4ff4-a5a1-3d67c9e0cb7b\") " pod="openstack/glance-db-sync-6msdh" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.868678 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20286861-2552-4ff4-a5a1-3d67c9e0cb7b-config-data\") pod \"glance-db-sync-6msdh\" (UID: \"20286861-2552-4ff4-a5a1-3d67c9e0cb7b\") " pod="openstack/glance-db-sync-6msdh" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.868795 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20286861-2552-4ff4-a5a1-3d67c9e0cb7b-combined-ca-bundle\") pod \"glance-db-sync-6msdh\" (UID: \"20286861-2552-4ff4-a5a1-3d67c9e0cb7b\") " pod="openstack/glance-db-sync-6msdh" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.868847 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/20286861-2552-4ff4-a5a1-3d67c9e0cb7b-db-sync-config-data\") pod \"glance-db-sync-6msdh\" (UID: \"20286861-2552-4ff4-a5a1-3d67c9e0cb7b\") " pod="openstack/glance-db-sync-6msdh" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.868927 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf6wh\" (UniqueName: \"kubernetes.io/projected/20286861-2552-4ff4-a5a1-3d67c9e0cb7b-kube-api-access-hf6wh\") pod \"glance-db-sync-6msdh\" (UID: \"20286861-2552-4ff4-a5a1-3d67c9e0cb7b\") " pod="openstack/glance-db-sync-6msdh" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.872015 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/20286861-2552-4ff4-a5a1-3d67c9e0cb7b-db-sync-config-data\") pod \"glance-db-sync-6msdh\" (UID: 
\"20286861-2552-4ff4-a5a1-3d67c9e0cb7b\") " pod="openstack/glance-db-sync-6msdh" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.872272 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20286861-2552-4ff4-a5a1-3d67c9e0cb7b-config-data\") pod \"glance-db-sync-6msdh\" (UID: \"20286861-2552-4ff4-a5a1-3d67c9e0cb7b\") " pod="openstack/glance-db-sync-6msdh" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.872790 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20286861-2552-4ff4-a5a1-3d67c9e0cb7b-combined-ca-bundle\") pod \"glance-db-sync-6msdh\" (UID: \"20286861-2552-4ff4-a5a1-3d67c9e0cb7b\") " pod="openstack/glance-db-sync-6msdh" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.881587 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf6wh\" (UniqueName: \"kubernetes.io/projected/20286861-2552-4ff4-a5a1-3d67c9e0cb7b-kube-api-access-hf6wh\") pod \"glance-db-sync-6msdh\" (UID: \"20286861-2552-4ff4-a5a1-3d67c9e0cb7b\") " pod="openstack/glance-db-sync-6msdh" Nov 24 09:18:01 crc kubenswrapper[4563]: I1124 09:18:01.970382 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-6msdh" Nov 24 09:18:02 crc kubenswrapper[4563]: I1124 09:18:02.420462 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-6msdh"] Nov 24 09:18:02 crc kubenswrapper[4563]: W1124 09:18:02.426622 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20286861_2552_4ff4_a5a1_3d67c9e0cb7b.slice/crio-c2e03d99ef699a09e98ef4f5039f0b9d3dc475b18d1422c5779806dcf5972bb4 WatchSource:0}: Error finding container c2e03d99ef699a09e98ef4f5039f0b9d3dc475b18d1422c5779806dcf5972bb4: Status 404 returned error can't find the container with id c2e03d99ef699a09e98ef4f5039f0b9d3dc475b18d1422c5779806dcf5972bb4 Nov 24 09:18:02 crc kubenswrapper[4563]: I1124 09:18:02.601937 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e4286a17-bf24-4c91-91cb-6e3f3d731d24","Type":"ContainerStarted","Data":"38156802d1ddfdcca5cd431c7fb277955aa2c8613d910c8ae419ae1320035f56"} Nov 24 09:18:02 crc kubenswrapper[4563]: I1124 09:18:02.602101 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:18:02 crc kubenswrapper[4563]: I1124 09:18:02.603657 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6msdh" event={"ID":"20286861-2552-4ff4-a5a1-3d67c9e0cb7b","Type":"ContainerStarted","Data":"c2e03d99ef699a09e98ef4f5039f0b9d3dc475b18d1422c5779806dcf5972bb4"} Nov 24 09:18:02 crc kubenswrapper[4563]: I1124 09:18:02.605790 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"18ec698b-354c-4d4e-9126-16c493474617","Type":"ContainerStarted","Data":"42642e2a9a6e213f467e573d7fa90a42a4dc0cb5e3ac48057c129287a5436410"} Nov 24 09:18:02 crc kubenswrapper[4563]: I1124 09:18:02.605994 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/rabbitmq-server-0" Nov 24 09:18:02 crc kubenswrapper[4563]: I1124 09:18:02.618507 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=40.373603855 podStartE2EDuration="51.618495595s" podCreationTimestamp="2025-11-24 09:17:11 +0000 UTC" firstStartedPulling="2025-11-24 09:17:17.47207543 +0000 UTC m=+814.731052877" lastFinishedPulling="2025-11-24 09:17:28.716967169 +0000 UTC m=+825.975944617" observedRunningTime="2025-11-24 09:18:02.617145529 +0000 UTC m=+859.876122976" watchObservedRunningTime="2025-11-24 09:18:02.618495595 +0000 UTC m=+859.877473043" Nov 24 09:18:02 crc kubenswrapper[4563]: I1124 09:18:02.638317 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.480100577 podStartE2EDuration="51.63830981s" podCreationTimestamp="2025-11-24 09:17:11 +0000 UTC" firstStartedPulling="2025-11-24 09:17:13.561860779 +0000 UTC m=+810.820838226" lastFinishedPulling="2025-11-24 09:17:28.720070012 +0000 UTC m=+825.979047459" observedRunningTime="2025-11-24 09:18:02.633532769 +0000 UTC m=+859.892510216" watchObservedRunningTime="2025-11-24 09:18:02.63830981 +0000 UTC m=+859.897287257" Nov 24 09:18:05 crc kubenswrapper[4563]: I1124 09:18:05.127473 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9d44fee0-139c-42c9-8ad1-3991121f1d67-etc-swift\") pod \"swift-storage-0\" (UID: \"9d44fee0-139c-42c9-8ad1-3991121f1d67\") " pod="openstack/swift-storage-0" Nov 24 09:18:05 crc kubenswrapper[4563]: I1124 09:18:05.139515 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9d44fee0-139c-42c9-8ad1-3991121f1d67-etc-swift\") pod \"swift-storage-0\" (UID: \"9d44fee0-139c-42c9-8ad1-3991121f1d67\") " pod="openstack/swift-storage-0" Nov 24 09:18:05 crc 
kubenswrapper[4563]: I1124 09:18:05.358009 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Nov 24 09:18:05 crc kubenswrapper[4563]: W1124 09:18:05.881224 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d44fee0_139c_42c9_8ad1_3991121f1d67.slice/crio-f0bfbcc04579318496e6ee773eb662bda26d60599b93b09ef48b983a098cf150 WatchSource:0}: Error finding container f0bfbcc04579318496e6ee773eb662bda26d60599b93b09ef48b983a098cf150: Status 404 returned error can't find the container with id f0bfbcc04579318496e6ee773eb662bda26d60599b93b09ef48b983a098cf150 Nov 24 09:18:05 crc kubenswrapper[4563]: I1124 09:18:05.896900 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Nov 24 09:18:06 crc kubenswrapper[4563]: I1124 09:18:06.645020 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d44fee0-139c-42c9-8ad1-3991121f1d67","Type":"ContainerStarted","Data":"f0bfbcc04579318496e6ee773eb662bda26d60599b93b09ef48b983a098cf150"} Nov 24 09:18:07 crc kubenswrapper[4563]: I1124 09:18:07.182211 4563 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-qtfnl" podUID="241e854a-eb29-4933-98be-bad6b9295260" containerName="ovn-controller" probeResult="failure" output=< Nov 24 09:18:07 crc kubenswrapper[4563]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 24 09:18:07 crc kubenswrapper[4563]: > Nov 24 09:18:07 crc kubenswrapper[4563]: I1124 09:18:07.193759 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-6z24f" Nov 24 09:18:07 crc kubenswrapper[4563]: I1124 09:18:07.199309 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-6z24f" Nov 24 09:18:07 crc kubenswrapper[4563]: I1124 
09:18:07.397707 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-qtfnl-config-5w8bg"] Nov 24 09:18:07 crc kubenswrapper[4563]: I1124 09:18:07.399071 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qtfnl-config-5w8bg" Nov 24 09:18:07 crc kubenswrapper[4563]: I1124 09:18:07.400720 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Nov 24 09:18:07 crc kubenswrapper[4563]: I1124 09:18:07.406988 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qtfnl-config-5w8bg"] Nov 24 09:18:07 crc kubenswrapper[4563]: I1124 09:18:07.489856 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dc02b6bb-89e5-4fce-899c-d917e73312fe-var-run-ovn\") pod \"ovn-controller-qtfnl-config-5w8bg\" (UID: \"dc02b6bb-89e5-4fce-899c-d917e73312fe\") " pod="openstack/ovn-controller-qtfnl-config-5w8bg" Nov 24 09:18:07 crc kubenswrapper[4563]: I1124 09:18:07.489940 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dc02b6bb-89e5-4fce-899c-d917e73312fe-var-run\") pod \"ovn-controller-qtfnl-config-5w8bg\" (UID: \"dc02b6bb-89e5-4fce-899c-d917e73312fe\") " pod="openstack/ovn-controller-qtfnl-config-5w8bg" Nov 24 09:18:07 crc kubenswrapper[4563]: I1124 09:18:07.490068 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/dc02b6bb-89e5-4fce-899c-d917e73312fe-additional-scripts\") pod \"ovn-controller-qtfnl-config-5w8bg\" (UID: \"dc02b6bb-89e5-4fce-899c-d917e73312fe\") " pod="openstack/ovn-controller-qtfnl-config-5w8bg" Nov 24 09:18:07 crc kubenswrapper[4563]: I1124 09:18:07.490256 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dc02b6bb-89e5-4fce-899c-d917e73312fe-var-log-ovn\") pod \"ovn-controller-qtfnl-config-5w8bg\" (UID: \"dc02b6bb-89e5-4fce-899c-d917e73312fe\") " pod="openstack/ovn-controller-qtfnl-config-5w8bg" Nov 24 09:18:07 crc kubenswrapper[4563]: I1124 09:18:07.490366 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc02b6bb-89e5-4fce-899c-d917e73312fe-scripts\") pod \"ovn-controller-qtfnl-config-5w8bg\" (UID: \"dc02b6bb-89e5-4fce-899c-d917e73312fe\") " pod="openstack/ovn-controller-qtfnl-config-5w8bg" Nov 24 09:18:07 crc kubenswrapper[4563]: I1124 09:18:07.490400 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vppd\" (UniqueName: \"kubernetes.io/projected/dc02b6bb-89e5-4fce-899c-d917e73312fe-kube-api-access-4vppd\") pod \"ovn-controller-qtfnl-config-5w8bg\" (UID: \"dc02b6bb-89e5-4fce-899c-d917e73312fe\") " pod="openstack/ovn-controller-qtfnl-config-5w8bg" Nov 24 09:18:07 crc kubenswrapper[4563]: I1124 09:18:07.591979 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dc02b6bb-89e5-4fce-899c-d917e73312fe-var-run\") pod \"ovn-controller-qtfnl-config-5w8bg\" (UID: \"dc02b6bb-89e5-4fce-899c-d917e73312fe\") " pod="openstack/ovn-controller-qtfnl-config-5w8bg" Nov 24 09:18:07 crc kubenswrapper[4563]: I1124 09:18:07.592029 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/dc02b6bb-89e5-4fce-899c-d917e73312fe-additional-scripts\") pod \"ovn-controller-qtfnl-config-5w8bg\" (UID: \"dc02b6bb-89e5-4fce-899c-d917e73312fe\") " pod="openstack/ovn-controller-qtfnl-config-5w8bg" Nov 24 09:18:07 crc kubenswrapper[4563]: I1124 
09:18:07.592062 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dc02b6bb-89e5-4fce-899c-d917e73312fe-var-log-ovn\") pod \"ovn-controller-qtfnl-config-5w8bg\" (UID: \"dc02b6bb-89e5-4fce-899c-d917e73312fe\") " pod="openstack/ovn-controller-qtfnl-config-5w8bg" Nov 24 09:18:07 crc kubenswrapper[4563]: I1124 09:18:07.592115 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc02b6bb-89e5-4fce-899c-d917e73312fe-scripts\") pod \"ovn-controller-qtfnl-config-5w8bg\" (UID: \"dc02b6bb-89e5-4fce-899c-d917e73312fe\") " pod="openstack/ovn-controller-qtfnl-config-5w8bg" Nov 24 09:18:07 crc kubenswrapper[4563]: I1124 09:18:07.592137 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vppd\" (UniqueName: \"kubernetes.io/projected/dc02b6bb-89e5-4fce-899c-d917e73312fe-kube-api-access-4vppd\") pod \"ovn-controller-qtfnl-config-5w8bg\" (UID: \"dc02b6bb-89e5-4fce-899c-d917e73312fe\") " pod="openstack/ovn-controller-qtfnl-config-5w8bg" Nov 24 09:18:07 crc kubenswrapper[4563]: I1124 09:18:07.592180 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dc02b6bb-89e5-4fce-899c-d917e73312fe-var-run-ovn\") pod \"ovn-controller-qtfnl-config-5w8bg\" (UID: \"dc02b6bb-89e5-4fce-899c-d917e73312fe\") " pod="openstack/ovn-controller-qtfnl-config-5w8bg" Nov 24 09:18:07 crc kubenswrapper[4563]: I1124 09:18:07.592444 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dc02b6bb-89e5-4fce-899c-d917e73312fe-var-run-ovn\") pod \"ovn-controller-qtfnl-config-5w8bg\" (UID: \"dc02b6bb-89e5-4fce-899c-d917e73312fe\") " pod="openstack/ovn-controller-qtfnl-config-5w8bg" Nov 24 09:18:07 crc kubenswrapper[4563]: I1124 09:18:07.592450 4563 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dc02b6bb-89e5-4fce-899c-d917e73312fe-var-log-ovn\") pod \"ovn-controller-qtfnl-config-5w8bg\" (UID: \"dc02b6bb-89e5-4fce-899c-d917e73312fe\") " pod="openstack/ovn-controller-qtfnl-config-5w8bg" Nov 24 09:18:07 crc kubenswrapper[4563]: I1124 09:18:07.592455 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dc02b6bb-89e5-4fce-899c-d917e73312fe-var-run\") pod \"ovn-controller-qtfnl-config-5w8bg\" (UID: \"dc02b6bb-89e5-4fce-899c-d917e73312fe\") " pod="openstack/ovn-controller-qtfnl-config-5w8bg" Nov 24 09:18:07 crc kubenswrapper[4563]: I1124 09:18:07.593129 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/dc02b6bb-89e5-4fce-899c-d917e73312fe-additional-scripts\") pod \"ovn-controller-qtfnl-config-5w8bg\" (UID: \"dc02b6bb-89e5-4fce-899c-d917e73312fe\") " pod="openstack/ovn-controller-qtfnl-config-5w8bg" Nov 24 09:18:07 crc kubenswrapper[4563]: I1124 09:18:07.595241 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc02b6bb-89e5-4fce-899c-d917e73312fe-scripts\") pod \"ovn-controller-qtfnl-config-5w8bg\" (UID: \"dc02b6bb-89e5-4fce-899c-d917e73312fe\") " pod="openstack/ovn-controller-qtfnl-config-5w8bg" Nov 24 09:18:07 crc kubenswrapper[4563]: I1124 09:18:07.622234 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vppd\" (UniqueName: \"kubernetes.io/projected/dc02b6bb-89e5-4fce-899c-d917e73312fe-kube-api-access-4vppd\") pod \"ovn-controller-qtfnl-config-5w8bg\" (UID: \"dc02b6bb-89e5-4fce-899c-d917e73312fe\") " pod="openstack/ovn-controller-qtfnl-config-5w8bg" Nov 24 09:18:07 crc kubenswrapper[4563]: I1124 09:18:07.731906 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-qtfnl-config-5w8bg" Nov 24 09:18:12 crc kubenswrapper[4563]: I1124 09:18:12.201773 4563 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-qtfnl" podUID="241e854a-eb29-4933-98be-bad6b9295260" containerName="ovn-controller" probeResult="failure" output=< Nov 24 09:18:12 crc kubenswrapper[4563]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 24 09:18:12 crc kubenswrapper[4563]: > Nov 24 09:18:12 crc kubenswrapper[4563]: I1124 09:18:12.445583 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qtfnl-config-5w8bg"] Nov 24 09:18:12 crc kubenswrapper[4563]: I1124 09:18:12.708929 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qtfnl-config-5w8bg" event={"ID":"dc02b6bb-89e5-4fce-899c-d917e73312fe","Type":"ContainerStarted","Data":"7fc5adf7e7dd334c85a9401b77c6e1ba3a7cc1d64970081f9bb4a7c460841bbf"} Nov 24 09:18:12 crc kubenswrapper[4563]: I1124 09:18:12.709254 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qtfnl-config-5w8bg" event={"ID":"dc02b6bb-89e5-4fce-899c-d917e73312fe","Type":"ContainerStarted","Data":"629e50b73c7d43f26413fe2f478444ad0a423de7284796b1720d4433ce07b8de"} Nov 24 09:18:12 crc kubenswrapper[4563]: I1124 09:18:12.710897 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6msdh" event={"ID":"20286861-2552-4ff4-a5a1-3d67c9e0cb7b","Type":"ContainerStarted","Data":"cf36b861866fe18ca83c30e811f07dfc6934b6b4d377c37c561572c96f7bfff6"} Nov 24 09:18:12 crc kubenswrapper[4563]: I1124 09:18:12.712920 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d44fee0-139c-42c9-8ad1-3991121f1d67","Type":"ContainerStarted","Data":"661ee039693473c2e0a9f094d786cbc5be147655f714a45e6aefce5087a9d783"} Nov 24 09:18:12 crc kubenswrapper[4563]: I1124 09:18:12.712958 4563 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d44fee0-139c-42c9-8ad1-3991121f1d67","Type":"ContainerStarted","Data":"3b9b3e2a55e51f971fba079d5415aa1d5837bfd8ebe123730a545a25078bead2"} Nov 24 09:18:12 crc kubenswrapper[4563]: I1124 09:18:12.712969 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d44fee0-139c-42c9-8ad1-3991121f1d67","Type":"ContainerStarted","Data":"da632169c85b600df61a054ffe977affda47761cb6267cb5bfc1ae8f27c7acc0"} Nov 24 09:18:12 crc kubenswrapper[4563]: I1124 09:18:12.712977 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d44fee0-139c-42c9-8ad1-3991121f1d67","Type":"ContainerStarted","Data":"83132bba4493721b0a381fc91cae9b4a95bb1a4d6e96b190618daad4e968a15d"} Nov 24 09:18:12 crc kubenswrapper[4563]: I1124 09:18:12.724710 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-qtfnl-config-5w8bg" podStartSLOduration=5.724695182 podStartE2EDuration="5.724695182s" podCreationTimestamp="2025-11-24 09:18:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:18:12.721122543 +0000 UTC m=+869.980099990" watchObservedRunningTime="2025-11-24 09:18:12.724695182 +0000 UTC m=+869.983672629" Nov 24 09:18:12 crc kubenswrapper[4563]: I1124 09:18:12.731487 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-6msdh" podStartSLOduration=2.072222523 podStartE2EDuration="11.731473167s" podCreationTimestamp="2025-11-24 09:18:01 +0000 UTC" firstStartedPulling="2025-11-24 09:18:02.428955713 +0000 UTC m=+859.687933160" lastFinishedPulling="2025-11-24 09:18:12.088206357 +0000 UTC m=+869.347183804" observedRunningTime="2025-11-24 09:18:12.730886601 +0000 UTC m=+869.989864047" watchObservedRunningTime="2025-11-24 09:18:12.731473167 
+0000 UTC m=+869.990450614" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.016834 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.247560 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-k9nfs"] Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.248501 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-k9nfs" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.256285 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-k9nfs"] Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.274784 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.348690 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-l585m"] Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.349670 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-l585m" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.366741 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-l585m"] Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.373430 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-d52e-account-create-p7ns9"] Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.374438 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-d52e-account-create-p7ns9" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.376657 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.387710 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d52e-account-create-p7ns9"] Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.396632 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8lz5\" (UniqueName: \"kubernetes.io/projected/4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550-kube-api-access-w8lz5\") pod \"cinder-db-create-k9nfs\" (UID: \"4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550\") " pod="openstack/cinder-db-create-k9nfs" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.396700 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550-operator-scripts\") pod \"cinder-db-create-k9nfs\" (UID: \"4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550\") " pod="openstack/cinder-db-create-k9nfs" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.452678 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-c035-account-create-q6pbl"] Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.453632 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-c035-account-create-q6pbl" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.456546 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.473013 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c035-account-create-q6pbl"] Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.499364 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fd85b6a-57be-47d8-955d-187926600e97-operator-scripts\") pod \"barbican-db-create-l585m\" (UID: \"4fd85b6a-57be-47d8-955d-187926600e97\") " pod="openstack/barbican-db-create-l585m" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.499430 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550-operator-scripts\") pod \"cinder-db-create-k9nfs\" (UID: \"4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550\") " pod="openstack/cinder-db-create-k9nfs" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.499466 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46v6s\" (UniqueName: \"kubernetes.io/projected/db3a0f98-33b9-450d-91e4-6575a60cfb2e-kube-api-access-46v6s\") pod \"barbican-d52e-account-create-p7ns9\" (UID: \"db3a0f98-33b9-450d-91e4-6575a60cfb2e\") " pod="openstack/barbican-d52e-account-create-p7ns9" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.499494 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79tpx\" (UniqueName: \"kubernetes.io/projected/a0e93314-6c63-4319-bc2d-7ef3c6b917ec-kube-api-access-79tpx\") pod \"cinder-c035-account-create-q6pbl\" (UID: 
\"a0e93314-6c63-4319-bc2d-7ef3c6b917ec\") " pod="openstack/cinder-c035-account-create-q6pbl" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.499577 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b56d6\" (UniqueName: \"kubernetes.io/projected/4fd85b6a-57be-47d8-955d-187926600e97-kube-api-access-b56d6\") pod \"barbican-db-create-l585m\" (UID: \"4fd85b6a-57be-47d8-955d-187926600e97\") " pod="openstack/barbican-db-create-l585m" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.499666 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e93314-6c63-4319-bc2d-7ef3c6b917ec-operator-scripts\") pod \"cinder-c035-account-create-q6pbl\" (UID: \"a0e93314-6c63-4319-bc2d-7ef3c6b917ec\") " pod="openstack/cinder-c035-account-create-q6pbl" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.499735 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db3a0f98-33b9-450d-91e4-6575a60cfb2e-operator-scripts\") pod \"barbican-d52e-account-create-p7ns9\" (UID: \"db3a0f98-33b9-450d-91e4-6575a60cfb2e\") " pod="openstack/barbican-d52e-account-create-p7ns9" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.499930 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8lz5\" (UniqueName: \"kubernetes.io/projected/4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550-kube-api-access-w8lz5\") pod \"cinder-db-create-k9nfs\" (UID: \"4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550\") " pod="openstack/cinder-db-create-k9nfs" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.500089 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550-operator-scripts\") pod 
\"cinder-db-create-k9nfs\" (UID: \"4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550\") " pod="openstack/cinder-db-create-k9nfs" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.519427 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8lz5\" (UniqueName: \"kubernetes.io/projected/4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550-kube-api-access-w8lz5\") pod \"cinder-db-create-k9nfs\" (UID: \"4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550\") " pod="openstack/cinder-db-create-k9nfs" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.563678 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-k9nfs" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.601472 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db3a0f98-33b9-450d-91e4-6575a60cfb2e-operator-scripts\") pod \"barbican-d52e-account-create-p7ns9\" (UID: \"db3a0f98-33b9-450d-91e4-6575a60cfb2e\") " pod="openstack/barbican-d52e-account-create-p7ns9" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.601586 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fd85b6a-57be-47d8-955d-187926600e97-operator-scripts\") pod \"barbican-db-create-l585m\" (UID: \"4fd85b6a-57be-47d8-955d-187926600e97\") " pod="openstack/barbican-db-create-l585m" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.601624 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46v6s\" (UniqueName: \"kubernetes.io/projected/db3a0f98-33b9-450d-91e4-6575a60cfb2e-kube-api-access-46v6s\") pod \"barbican-d52e-account-create-p7ns9\" (UID: \"db3a0f98-33b9-450d-91e4-6575a60cfb2e\") " pod="openstack/barbican-d52e-account-create-p7ns9" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.601657 4563 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-79tpx\" (UniqueName: \"kubernetes.io/projected/a0e93314-6c63-4319-bc2d-7ef3c6b917ec-kube-api-access-79tpx\") pod \"cinder-c035-account-create-q6pbl\" (UID: \"a0e93314-6c63-4319-bc2d-7ef3c6b917ec\") " pod="openstack/cinder-c035-account-create-q6pbl" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.601687 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b56d6\" (UniqueName: \"kubernetes.io/projected/4fd85b6a-57be-47d8-955d-187926600e97-kube-api-access-b56d6\") pod \"barbican-db-create-l585m\" (UID: \"4fd85b6a-57be-47d8-955d-187926600e97\") " pod="openstack/barbican-db-create-l585m" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.601721 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e93314-6c63-4319-bc2d-7ef3c6b917ec-operator-scripts\") pod \"cinder-c035-account-create-q6pbl\" (UID: \"a0e93314-6c63-4319-bc2d-7ef3c6b917ec\") " pod="openstack/cinder-c035-account-create-q6pbl" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.602580 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e93314-6c63-4319-bc2d-7ef3c6b917ec-operator-scripts\") pod \"cinder-c035-account-create-q6pbl\" (UID: \"a0e93314-6c63-4319-bc2d-7ef3c6b917ec\") " pod="openstack/cinder-c035-account-create-q6pbl" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.603089 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db3a0f98-33b9-450d-91e4-6575a60cfb2e-operator-scripts\") pod \"barbican-d52e-account-create-p7ns9\" (UID: \"db3a0f98-33b9-450d-91e4-6575a60cfb2e\") " pod="openstack/barbican-d52e-account-create-p7ns9" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.603909 4563 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fd85b6a-57be-47d8-955d-187926600e97-operator-scripts\") pod \"barbican-db-create-l585m\" (UID: \"4fd85b6a-57be-47d8-955d-187926600e97\") " pod="openstack/barbican-db-create-l585m" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.618228 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46v6s\" (UniqueName: \"kubernetes.io/projected/db3a0f98-33b9-450d-91e4-6575a60cfb2e-kube-api-access-46v6s\") pod \"barbican-d52e-account-create-p7ns9\" (UID: \"db3a0f98-33b9-450d-91e4-6575a60cfb2e\") " pod="openstack/barbican-d52e-account-create-p7ns9" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.619406 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79tpx\" (UniqueName: \"kubernetes.io/projected/a0e93314-6c63-4319-bc2d-7ef3c6b917ec-kube-api-access-79tpx\") pod \"cinder-c035-account-create-q6pbl\" (UID: \"a0e93314-6c63-4319-bc2d-7ef3c6b917ec\") " pod="openstack/cinder-c035-account-create-q6pbl" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.621528 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b56d6\" (UniqueName: \"kubernetes.io/projected/4fd85b6a-57be-47d8-955d-187926600e97-kube-api-access-b56d6\") pod \"barbican-db-create-l585m\" (UID: \"4fd85b6a-57be-47d8-955d-187926600e97\") " pod="openstack/barbican-db-create-l585m" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.651938 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-t4vt4"] Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.652910 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-t4vt4" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.659071 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-4b13-account-create-4626m"] Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.659891 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4b13-account-create-4626m" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.663199 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.664100 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-t4vt4"] Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.681584 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-l585m" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.683585 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4b13-account-create-4626m"] Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.708534 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76c84c12-1726-494a-91ab-598ce15287ae-operator-scripts\") pod \"neutron-db-create-t4vt4\" (UID: \"76c84c12-1726-494a-91ab-598ce15287ae\") " pod="openstack/neutron-db-create-t4vt4" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.708623 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mdmg\" (UniqueName: \"kubernetes.io/projected/76c84c12-1726-494a-91ab-598ce15287ae-kube-api-access-7mdmg\") pod \"neutron-db-create-t4vt4\" (UID: \"76c84c12-1726-494a-91ab-598ce15287ae\") " pod="openstack/neutron-db-create-t4vt4" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.708854 4563 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/barbican-d52e-account-create-p7ns9" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.722877 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-4w2gs"] Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.724328 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4w2gs" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.726520 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.726520 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.728314 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.728477 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sw8dk" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.733408 4563 generic.go:334] "Generic (PLEG): container finished" podID="dc02b6bb-89e5-4fce-899c-d917e73312fe" containerID="7fc5adf7e7dd334c85a9401b77c6e1ba3a7cc1d64970081f9bb4a7c460841bbf" exitCode=0 Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.733741 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qtfnl-config-5w8bg" event={"ID":"dc02b6bb-89e5-4fce-899c-d917e73312fe","Type":"ContainerDied","Data":"7fc5adf7e7dd334c85a9401b77c6e1ba3a7cc1d64970081f9bb4a7c460841bbf"} Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.735117 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-4w2gs"] Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.770490 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-c035-account-create-q6pbl" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.810288 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1190dd8-6aef-4116-be7f-e498cfe0db11-combined-ca-bundle\") pod \"keystone-db-sync-4w2gs\" (UID: \"d1190dd8-6aef-4116-be7f-e498cfe0db11\") " pod="openstack/keystone-db-sync-4w2gs" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.810383 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8pp6\" (UniqueName: \"kubernetes.io/projected/26761262-f331-4bde-8b02-ff48aa5f3875-kube-api-access-g8pp6\") pod \"neutron-4b13-account-create-4626m\" (UID: \"26761262-f331-4bde-8b02-ff48aa5f3875\") " pod="openstack/neutron-4b13-account-create-4626m" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.810445 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26761262-f331-4bde-8b02-ff48aa5f3875-operator-scripts\") pod \"neutron-4b13-account-create-4626m\" (UID: \"26761262-f331-4bde-8b02-ff48aa5f3875\") " pod="openstack/neutron-4b13-account-create-4626m" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.810503 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76c84c12-1726-494a-91ab-598ce15287ae-operator-scripts\") pod \"neutron-db-create-t4vt4\" (UID: \"76c84c12-1726-494a-91ab-598ce15287ae\") " pod="openstack/neutron-db-create-t4vt4" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.810527 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1190dd8-6aef-4116-be7f-e498cfe0db11-config-data\") pod 
\"keystone-db-sync-4w2gs\" (UID: \"d1190dd8-6aef-4116-be7f-e498cfe0db11\") " pod="openstack/keystone-db-sync-4w2gs" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.810607 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mdmg\" (UniqueName: \"kubernetes.io/projected/76c84c12-1726-494a-91ab-598ce15287ae-kube-api-access-7mdmg\") pod \"neutron-db-create-t4vt4\" (UID: \"76c84c12-1726-494a-91ab-598ce15287ae\") " pod="openstack/neutron-db-create-t4vt4" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.810658 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cznvl\" (UniqueName: \"kubernetes.io/projected/d1190dd8-6aef-4116-be7f-e498cfe0db11-kube-api-access-cznvl\") pod \"keystone-db-sync-4w2gs\" (UID: \"d1190dd8-6aef-4116-be7f-e498cfe0db11\") " pod="openstack/keystone-db-sync-4w2gs" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.811424 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76c84c12-1726-494a-91ab-598ce15287ae-operator-scripts\") pod \"neutron-db-create-t4vt4\" (UID: \"76c84c12-1726-494a-91ab-598ce15287ae\") " pod="openstack/neutron-db-create-t4vt4" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.827904 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mdmg\" (UniqueName: \"kubernetes.io/projected/76c84c12-1726-494a-91ab-598ce15287ae-kube-api-access-7mdmg\") pod \"neutron-db-create-t4vt4\" (UID: \"76c84c12-1726-494a-91ab-598ce15287ae\") " pod="openstack/neutron-db-create-t4vt4" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.912264 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26761262-f331-4bde-8b02-ff48aa5f3875-operator-scripts\") pod \"neutron-4b13-account-create-4626m\" (UID: 
\"26761262-f331-4bde-8b02-ff48aa5f3875\") " pod="openstack/neutron-4b13-account-create-4626m" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.912346 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1190dd8-6aef-4116-be7f-e498cfe0db11-config-data\") pod \"keystone-db-sync-4w2gs\" (UID: \"d1190dd8-6aef-4116-be7f-e498cfe0db11\") " pod="openstack/keystone-db-sync-4w2gs" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.912459 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cznvl\" (UniqueName: \"kubernetes.io/projected/d1190dd8-6aef-4116-be7f-e498cfe0db11-kube-api-access-cznvl\") pod \"keystone-db-sync-4w2gs\" (UID: \"d1190dd8-6aef-4116-be7f-e498cfe0db11\") " pod="openstack/keystone-db-sync-4w2gs" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.912493 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1190dd8-6aef-4116-be7f-e498cfe0db11-combined-ca-bundle\") pod \"keystone-db-sync-4w2gs\" (UID: \"d1190dd8-6aef-4116-be7f-e498cfe0db11\") " pod="openstack/keystone-db-sync-4w2gs" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.912540 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8pp6\" (UniqueName: \"kubernetes.io/projected/26761262-f331-4bde-8b02-ff48aa5f3875-kube-api-access-g8pp6\") pod \"neutron-4b13-account-create-4626m\" (UID: \"26761262-f331-4bde-8b02-ff48aa5f3875\") " pod="openstack/neutron-4b13-account-create-4626m" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.913077 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26761262-f331-4bde-8b02-ff48aa5f3875-operator-scripts\") pod \"neutron-4b13-account-create-4626m\" (UID: \"26761262-f331-4bde-8b02-ff48aa5f3875\") " 
pod="openstack/neutron-4b13-account-create-4626m" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.915792 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1190dd8-6aef-4116-be7f-e498cfe0db11-config-data\") pod \"keystone-db-sync-4w2gs\" (UID: \"d1190dd8-6aef-4116-be7f-e498cfe0db11\") " pod="openstack/keystone-db-sync-4w2gs" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.917596 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1190dd8-6aef-4116-be7f-e498cfe0db11-combined-ca-bundle\") pod \"keystone-db-sync-4w2gs\" (UID: \"d1190dd8-6aef-4116-be7f-e498cfe0db11\") " pod="openstack/keystone-db-sync-4w2gs" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.926789 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cznvl\" (UniqueName: \"kubernetes.io/projected/d1190dd8-6aef-4116-be7f-e498cfe0db11-kube-api-access-cznvl\") pod \"keystone-db-sync-4w2gs\" (UID: \"d1190dd8-6aef-4116-be7f-e498cfe0db11\") " pod="openstack/keystone-db-sync-4w2gs" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.931139 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8pp6\" (UniqueName: \"kubernetes.io/projected/26761262-f331-4bde-8b02-ff48aa5f3875-kube-api-access-g8pp6\") pod \"neutron-4b13-account-create-4626m\" (UID: \"26761262-f331-4bde-8b02-ff48aa5f3875\") " pod="openstack/neutron-4b13-account-create-4626m" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.967947 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-t4vt4" Nov 24 09:18:13 crc kubenswrapper[4563]: I1124 09:18:13.991526 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4b13-account-create-4626m" Nov 24 09:18:14 crc kubenswrapper[4563]: I1124 09:18:14.045422 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4w2gs" Nov 24 09:18:14 crc kubenswrapper[4563]: I1124 09:18:14.052741 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-k9nfs"] Nov 24 09:18:14 crc kubenswrapper[4563]: W1124 09:18:14.097719 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e633cdc_e6d0_4bb6_bfe2_c6c1f1fef550.slice/crio-cdefc5ae1fead29d83cf0d456aa431c53be66d4e65f77942819a3ef0fdd2c8cb WatchSource:0}: Error finding container cdefc5ae1fead29d83cf0d456aa431c53be66d4e65f77942819a3ef0fdd2c8cb: Status 404 returned error can't find the container with id cdefc5ae1fead29d83cf0d456aa431c53be66d4e65f77942819a3ef0fdd2c8cb Nov 24 09:18:14 crc kubenswrapper[4563]: I1124 09:18:14.559069 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c035-account-create-q6pbl"] Nov 24 09:18:14 crc kubenswrapper[4563]: W1124 09:18:14.578052 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0e93314_6c63_4319_bc2d_7ef3c6b917ec.slice/crio-85fa88aca93bf21819498cec51a5685063e6ad8a943c2d525866420b09d85cae WatchSource:0}: Error finding container 85fa88aca93bf21819498cec51a5685063e6ad8a943c2d525866420b09d85cae: Status 404 returned error can't find the container with id 85fa88aca93bf21819498cec51a5685063e6ad8a943c2d525866420b09d85cae Nov 24 09:18:14 crc kubenswrapper[4563]: I1124 09:18:14.666756 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-t4vt4"] Nov 24 09:18:14 crc kubenswrapper[4563]: I1124 09:18:14.769789 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"9d44fee0-139c-42c9-8ad1-3991121f1d67","Type":"ContainerStarted","Data":"c340cb15d90c10d034994ec5aa5a40f6db7b9780777d0f6a6b4ae00da3358834"} Nov 24 09:18:14 crc kubenswrapper[4563]: I1124 09:18:14.769840 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d44fee0-139c-42c9-8ad1-3991121f1d67","Type":"ContainerStarted","Data":"904b3d36bb8a36c0143559c25b76b73c9cb9b447b1ac5d55703a28773e9e25e0"} Nov 24 09:18:14 crc kubenswrapper[4563]: I1124 09:18:14.775410 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-t4vt4" event={"ID":"76c84c12-1726-494a-91ab-598ce15287ae","Type":"ContainerStarted","Data":"3d0e7cc1d08d09490e8871a933ce81f0e6236dfcc2d6b3872c7fb9e3702ef7d1"} Nov 24 09:18:14 crc kubenswrapper[4563]: I1124 09:18:14.777062 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c035-account-create-q6pbl" event={"ID":"a0e93314-6c63-4319-bc2d-7ef3c6b917ec","Type":"ContainerStarted","Data":"85fa88aca93bf21819498cec51a5685063e6ad8a943c2d525866420b09d85cae"} Nov 24 09:18:14 crc kubenswrapper[4563]: I1124 09:18:14.786003 4563 generic.go:334] "Generic (PLEG): container finished" podID="4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550" containerID="1eca9adbe2a535a1d64ad8ac8ca6fbde84a77d6dcc8242798f4621e2b7608740" exitCode=0 Nov 24 09:18:14 crc kubenswrapper[4563]: I1124 09:18:14.786540 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-k9nfs" event={"ID":"4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550","Type":"ContainerDied","Data":"1eca9adbe2a535a1d64ad8ac8ca6fbde84a77d6dcc8242798f4621e2b7608740"} Nov 24 09:18:14 crc kubenswrapper[4563]: I1124 09:18:14.786586 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-k9nfs" event={"ID":"4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550","Type":"ContainerStarted","Data":"cdefc5ae1fead29d83cf0d456aa431c53be66d4e65f77942819a3ef0fdd2c8cb"} Nov 24 09:18:14 crc kubenswrapper[4563]: 
I1124 09:18:14.794633 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-c035-account-create-q6pbl" podStartSLOduration=1.794621247 podStartE2EDuration="1.794621247s" podCreationTimestamp="2025-11-24 09:18:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:18:14.789714721 +0000 UTC m=+872.048692169" watchObservedRunningTime="2025-11-24 09:18:14.794621247 +0000 UTC m=+872.053598684" Nov 24 09:18:14 crc kubenswrapper[4563]: I1124 09:18:14.805884 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-l585m"] Nov 24 09:18:14 crc kubenswrapper[4563]: W1124 09:18:14.823557 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1190dd8_6aef_4116_be7f_e498cfe0db11.slice/crio-1c0060a633c96b076cb9e07e135404adab02bb4bc710233ec50afe9829775e97 WatchSource:0}: Error finding container 1c0060a633c96b076cb9e07e135404adab02bb4bc710233ec50afe9829775e97: Status 404 returned error can't find the container with id 1c0060a633c96b076cb9e07e135404adab02bb4bc710233ec50afe9829775e97 Nov 24 09:18:14 crc kubenswrapper[4563]: I1124 09:18:14.826323 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-4w2gs"] Nov 24 09:18:14 crc kubenswrapper[4563]: I1124 09:18:14.875096 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d52e-account-create-p7ns9"] Nov 24 09:18:14 crc kubenswrapper[4563]: I1124 09:18:14.900359 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4b13-account-create-4626m"] Nov 24 09:18:14 crc kubenswrapper[4563]: W1124 09:18:14.915727 4563 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26761262_f331_4bde_8b02_ff48aa5f3875.slice/crio-23fa8d84f8040dd494e8340fa5057d1f31a95688c187b1f47584b25ffa12fd35 WatchSource:0}: Error finding container 23fa8d84f8040dd494e8340fa5057d1f31a95688c187b1f47584b25ffa12fd35: Status 404 returned error can't find the container with id 23fa8d84f8040dd494e8340fa5057d1f31a95688c187b1f47584b25ffa12fd35 Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.211194 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qtfnl-config-5w8bg" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.361515 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dc02b6bb-89e5-4fce-899c-d917e73312fe-var-run\") pod \"dc02b6bb-89e5-4fce-899c-d917e73312fe\" (UID: \"dc02b6bb-89e5-4fce-899c-d917e73312fe\") " Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.361630 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc02b6bb-89e5-4fce-899c-d917e73312fe-var-run" (OuterVolumeSpecName: "var-run") pod "dc02b6bb-89e5-4fce-899c-d917e73312fe" (UID: "dc02b6bb-89e5-4fce-899c-d917e73312fe"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.361716 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vppd\" (UniqueName: \"kubernetes.io/projected/dc02b6bb-89e5-4fce-899c-d917e73312fe-kube-api-access-4vppd\") pod \"dc02b6bb-89e5-4fce-899c-d917e73312fe\" (UID: \"dc02b6bb-89e5-4fce-899c-d917e73312fe\") " Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.361755 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dc02b6bb-89e5-4fce-899c-d917e73312fe-var-run-ovn\") pod \"dc02b6bb-89e5-4fce-899c-d917e73312fe\" (UID: \"dc02b6bb-89e5-4fce-899c-d917e73312fe\") " Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.361791 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc02b6bb-89e5-4fce-899c-d917e73312fe-scripts\") pod \"dc02b6bb-89e5-4fce-899c-d917e73312fe\" (UID: \"dc02b6bb-89e5-4fce-899c-d917e73312fe\") " Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.361853 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc02b6bb-89e5-4fce-899c-d917e73312fe-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "dc02b6bb-89e5-4fce-899c-d917e73312fe" (UID: "dc02b6bb-89e5-4fce-899c-d917e73312fe"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.361866 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/dc02b6bb-89e5-4fce-899c-d917e73312fe-additional-scripts\") pod \"dc02b6bb-89e5-4fce-899c-d917e73312fe\" (UID: \"dc02b6bb-89e5-4fce-899c-d917e73312fe\") " Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.361955 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dc02b6bb-89e5-4fce-899c-d917e73312fe-var-log-ovn\") pod \"dc02b6bb-89e5-4fce-899c-d917e73312fe\" (UID: \"dc02b6bb-89e5-4fce-899c-d917e73312fe\") " Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.362420 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc02b6bb-89e5-4fce-899c-d917e73312fe-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "dc02b6bb-89e5-4fce-899c-d917e73312fe" (UID: "dc02b6bb-89e5-4fce-899c-d917e73312fe"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.362452 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc02b6bb-89e5-4fce-899c-d917e73312fe-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "dc02b6bb-89e5-4fce-899c-d917e73312fe" (UID: "dc02b6bb-89e5-4fce-899c-d917e73312fe"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.362669 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc02b6bb-89e5-4fce-899c-d917e73312fe-scripts" (OuterVolumeSpecName: "scripts") pod "dc02b6bb-89e5-4fce-899c-d917e73312fe" (UID: "dc02b6bb-89e5-4fce-899c-d917e73312fe"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.362698 4563 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dc02b6bb-89e5-4fce-899c-d917e73312fe-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.362713 4563 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/dc02b6bb-89e5-4fce-899c-d917e73312fe-additional-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.362722 4563 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dc02b6bb-89e5-4fce-899c-d917e73312fe-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.362730 4563 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dc02b6bb-89e5-4fce-899c-d917e73312fe-var-run\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.367336 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc02b6bb-89e5-4fce-899c-d917e73312fe-kube-api-access-4vppd" (OuterVolumeSpecName: "kube-api-access-4vppd") pod "dc02b6bb-89e5-4fce-899c-d917e73312fe" (UID: "dc02b6bb-89e5-4fce-899c-d917e73312fe"). InnerVolumeSpecName "kube-api-access-4vppd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.465007 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vppd\" (UniqueName: \"kubernetes.io/projected/dc02b6bb-89e5-4fce-899c-d917e73312fe-kube-api-access-4vppd\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.465044 4563 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc02b6bb-89e5-4fce-899c-d917e73312fe-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.535348 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-qtfnl-config-5w8bg"] Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.549624 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-qtfnl-config-5w8bg"] Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.625491 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-qtfnl-config-xk28x"] Nov 24 09:18:15 crc kubenswrapper[4563]: E1124 09:18:15.625822 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc02b6bb-89e5-4fce-899c-d917e73312fe" containerName="ovn-config" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.625835 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc02b6bb-89e5-4fce-899c-d917e73312fe" containerName="ovn-config" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.625993 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc02b6bb-89e5-4fce-899c-d917e73312fe" containerName="ovn-config" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.626439 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-qtfnl-config-xk28x" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.634029 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qtfnl-config-xk28x"] Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.770555 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-var-run\") pod \"ovn-controller-qtfnl-config-xk28x\" (UID: \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\") " pod="openstack/ovn-controller-qtfnl-config-xk28x" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.770659 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-additional-scripts\") pod \"ovn-controller-qtfnl-config-xk28x\" (UID: \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\") " pod="openstack/ovn-controller-qtfnl-config-xk28x" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.770711 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-var-run-ovn\") pod \"ovn-controller-qtfnl-config-xk28x\" (UID: \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\") " pod="openstack/ovn-controller-qtfnl-config-xk28x" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.770835 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-scripts\") pod \"ovn-controller-qtfnl-config-xk28x\" (UID: \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\") " pod="openstack/ovn-controller-qtfnl-config-xk28x" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.770916 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bptgj\" (UniqueName: \"kubernetes.io/projected/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-kube-api-access-bptgj\") pod \"ovn-controller-qtfnl-config-xk28x\" (UID: \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\") " pod="openstack/ovn-controller-qtfnl-config-xk28x" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.770981 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-var-log-ovn\") pod \"ovn-controller-qtfnl-config-xk28x\" (UID: \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\") " pod="openstack/ovn-controller-qtfnl-config-xk28x" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.799663 4563 generic.go:334] "Generic (PLEG): container finished" podID="4fd85b6a-57be-47d8-955d-187926600e97" containerID="dc931fdc1a3b3855e7604863182b2f20b24224eeace1c806902e5b4024b07382" exitCode=0 Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.799760 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-l585m" event={"ID":"4fd85b6a-57be-47d8-955d-187926600e97","Type":"ContainerDied","Data":"dc931fdc1a3b3855e7604863182b2f20b24224eeace1c806902e5b4024b07382"} Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.799799 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-l585m" event={"ID":"4fd85b6a-57be-47d8-955d-187926600e97","Type":"ContainerStarted","Data":"c2a35c883f423671c9a24327292d938fb5981d2b79e5d5f29e795701f2f69a1d"} Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.800977 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4w2gs" event={"ID":"d1190dd8-6aef-4116-be7f-e498cfe0db11","Type":"ContainerStarted","Data":"1c0060a633c96b076cb9e07e135404adab02bb4bc710233ec50afe9829775e97"} Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.805604 
4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d44fee0-139c-42c9-8ad1-3991121f1d67","Type":"ContainerStarted","Data":"b328457c393d238b13c5bf8d7cd167e83b91d9284ede9289cae346475f6536dd"} Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.805633 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d44fee0-139c-42c9-8ad1-3991121f1d67","Type":"ContainerStarted","Data":"54345de4fa056004aa4c56626b94e117a23ab8597a41a11820e228e01a59e528"} Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.807303 4563 generic.go:334] "Generic (PLEG): container finished" podID="26761262-f331-4bde-8b02-ff48aa5f3875" containerID="289ebe41425706f766743fcb4109dbe0797f68d0730b502ebdd960199f36d52b" exitCode=0 Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.807360 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4b13-account-create-4626m" event={"ID":"26761262-f331-4bde-8b02-ff48aa5f3875","Type":"ContainerDied","Data":"289ebe41425706f766743fcb4109dbe0797f68d0730b502ebdd960199f36d52b"} Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.807376 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4b13-account-create-4626m" event={"ID":"26761262-f331-4bde-8b02-ff48aa5f3875","Type":"ContainerStarted","Data":"23fa8d84f8040dd494e8340fa5057d1f31a95688c187b1f47584b25ffa12fd35"} Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.809029 4563 generic.go:334] "Generic (PLEG): container finished" podID="76c84c12-1726-494a-91ab-598ce15287ae" containerID="38580ef16e0473e5a0968c04ed2852b6e8f41ec8ad01508bb9978bc7b660f66d" exitCode=0 Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.809077 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-t4vt4" event={"ID":"76c84c12-1726-494a-91ab-598ce15287ae","Type":"ContainerDied","Data":"38580ef16e0473e5a0968c04ed2852b6e8f41ec8ad01508bb9978bc7b660f66d"} 
Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.810532 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qtfnl-config-5w8bg" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.810557 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="629e50b73c7d43f26413fe2f478444ad0a423de7284796b1720d4433ce07b8de" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.812100 4563 generic.go:334] "Generic (PLEG): container finished" podID="a0e93314-6c63-4319-bc2d-7ef3c6b917ec" containerID="0bb3afca1b8eb24e5748089f4bde24844a4d21f3f69bc044ff2f09fa889e9655" exitCode=0 Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.812169 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c035-account-create-q6pbl" event={"ID":"a0e93314-6c63-4319-bc2d-7ef3c6b917ec","Type":"ContainerDied","Data":"0bb3afca1b8eb24e5748089f4bde24844a4d21f3f69bc044ff2f09fa889e9655"} Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.819852 4563 generic.go:334] "Generic (PLEG): container finished" podID="db3a0f98-33b9-450d-91e4-6575a60cfb2e" containerID="c1f9e4633c7081ec9d9e372f4eac02386363375477b604311b4e4cdde2fdaba4" exitCode=0 Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.820136 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d52e-account-create-p7ns9" event={"ID":"db3a0f98-33b9-450d-91e4-6575a60cfb2e","Type":"ContainerDied","Data":"c1f9e4633c7081ec9d9e372f4eac02386363375477b604311b4e4cdde2fdaba4"} Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.820174 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d52e-account-create-p7ns9" event={"ID":"db3a0f98-33b9-450d-91e4-6575a60cfb2e","Type":"ContainerStarted","Data":"8dab950d3e3defb8f4a6140b719434e9cfb12d143040ecf9b116d5727567de7d"} Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.872526 4563 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-scripts\") pod \"ovn-controller-qtfnl-config-xk28x\" (UID: \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\") " pod="openstack/ovn-controller-qtfnl-config-xk28x" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.872871 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bptgj\" (UniqueName: \"kubernetes.io/projected/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-kube-api-access-bptgj\") pod \"ovn-controller-qtfnl-config-xk28x\" (UID: \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\") " pod="openstack/ovn-controller-qtfnl-config-xk28x" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.872939 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-var-log-ovn\") pod \"ovn-controller-qtfnl-config-xk28x\" (UID: \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\") " pod="openstack/ovn-controller-qtfnl-config-xk28x" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.873022 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-var-run\") pod \"ovn-controller-qtfnl-config-xk28x\" (UID: \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\") " pod="openstack/ovn-controller-qtfnl-config-xk28x" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.873076 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-additional-scripts\") pod \"ovn-controller-qtfnl-config-xk28x\" (UID: \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\") " pod="openstack/ovn-controller-qtfnl-config-xk28x" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.873119 4563 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-var-run-ovn\") pod \"ovn-controller-qtfnl-config-xk28x\" (UID: \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\") " pod="openstack/ovn-controller-qtfnl-config-xk28x" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.873357 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-var-run-ovn\") pod \"ovn-controller-qtfnl-config-xk28x\" (UID: \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\") " pod="openstack/ovn-controller-qtfnl-config-xk28x" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.873542 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-var-log-ovn\") pod \"ovn-controller-qtfnl-config-xk28x\" (UID: \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\") " pod="openstack/ovn-controller-qtfnl-config-xk28x" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.873845 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-var-run\") pod \"ovn-controller-qtfnl-config-xk28x\" (UID: \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\") " pod="openstack/ovn-controller-qtfnl-config-xk28x" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.874476 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-additional-scripts\") pod \"ovn-controller-qtfnl-config-xk28x\" (UID: \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\") " pod="openstack/ovn-controller-qtfnl-config-xk28x" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.875145 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-scripts\") pod \"ovn-controller-qtfnl-config-xk28x\" (UID: \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\") " pod="openstack/ovn-controller-qtfnl-config-xk28x" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.895089 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bptgj\" (UniqueName: \"kubernetes.io/projected/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-kube-api-access-bptgj\") pod \"ovn-controller-qtfnl-config-xk28x\" (UID: \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\") " pod="openstack/ovn-controller-qtfnl-config-xk28x" Nov 24 09:18:15 crc kubenswrapper[4563]: I1124 09:18:15.943736 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qtfnl-config-xk28x" Nov 24 09:18:16 crc kubenswrapper[4563]: I1124 09:18:16.134021 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-k9nfs" Nov 24 09:18:16 crc kubenswrapper[4563]: I1124 09:18:16.281253 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8lz5\" (UniqueName: \"kubernetes.io/projected/4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550-kube-api-access-w8lz5\") pod \"4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550\" (UID: \"4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550\") " Nov 24 09:18:16 crc kubenswrapper[4563]: I1124 09:18:16.281327 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550-operator-scripts\") pod \"4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550\" (UID: \"4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550\") " Nov 24 09:18:16 crc kubenswrapper[4563]: I1124 09:18:16.282094 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550" (UID: "4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:16 crc kubenswrapper[4563]: I1124 09:18:16.286042 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550-kube-api-access-w8lz5" (OuterVolumeSpecName: "kube-api-access-w8lz5") pod "4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550" (UID: "4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550"). InnerVolumeSpecName "kube-api-access-w8lz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:18:16 crc kubenswrapper[4563]: I1124 09:18:16.383863 4563 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:16 crc kubenswrapper[4563]: I1124 09:18:16.383899 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8lz5\" (UniqueName: \"kubernetes.io/projected/4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550-kube-api-access-w8lz5\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:16 crc kubenswrapper[4563]: I1124 09:18:16.410845 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qtfnl-config-xk28x"] Nov 24 09:18:16 crc kubenswrapper[4563]: W1124 09:18:16.532993 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9196b8a0_3cfc_4189_bb0d_ad19c1e1a2f3.slice/crio-eb367c76a129cf3a11d68f67cb3c135e18779a938f32b20bf499dc79bca11a08 WatchSource:0}: Error finding container eb367c76a129cf3a11d68f67cb3c135e18779a938f32b20bf499dc79bca11a08: Status 404 returned error can't find the container with id eb367c76a129cf3a11d68f67cb3c135e18779a938f32b20bf499dc79bca11a08 Nov 24 09:18:16 crc kubenswrapper[4563]: I1124 09:18:16.859099 4563 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qtfnl-config-xk28x" event={"ID":"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3","Type":"ContainerStarted","Data":"eb367c76a129cf3a11d68f67cb3c135e18779a938f32b20bf499dc79bca11a08"} Nov 24 09:18:16 crc kubenswrapper[4563]: I1124 09:18:16.871364 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d44fee0-139c-42c9-8ad1-3991121f1d67","Type":"ContainerStarted","Data":"d9adb334d58d2389ba93e1556b9df4dae77b0b4ca3e55b7e5ee94166e9051c7d"} Nov 24 09:18:16 crc kubenswrapper[4563]: I1124 09:18:16.872923 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-k9nfs" event={"ID":"4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550","Type":"ContainerDied","Data":"cdefc5ae1fead29d83cf0d456aa431c53be66d4e65f77942819a3ef0fdd2c8cb"} Nov 24 09:18:16 crc kubenswrapper[4563]: I1124 09:18:16.872966 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdefc5ae1fead29d83cf0d456aa431c53be66d4e65f77942819a3ef0fdd2c8cb" Nov 24 09:18:16 crc kubenswrapper[4563]: I1124 09:18:16.873142 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-k9nfs" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.070568 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc02b6bb-89e5-4fce-899c-d917e73312fe" path="/var/lib/kubelet/pods/dc02b6bb-89e5-4fce-899c-d917e73312fe/volumes" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.225347 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-qtfnl" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.277010 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d52e-account-create-p7ns9" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.399411 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-l585m" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.414451 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4b13-account-create-4626m" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.419792 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c035-account-create-q6pbl" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.420148 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46v6s\" (UniqueName: \"kubernetes.io/projected/db3a0f98-33b9-450d-91e4-6575a60cfb2e-kube-api-access-46v6s\") pod \"db3a0f98-33b9-450d-91e4-6575a60cfb2e\" (UID: \"db3a0f98-33b9-450d-91e4-6575a60cfb2e\") " Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.420391 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db3a0f98-33b9-450d-91e4-6575a60cfb2e-operator-scripts\") pod \"db3a0f98-33b9-450d-91e4-6575a60cfb2e\" (UID: \"db3a0f98-33b9-450d-91e4-6575a60cfb2e\") " Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.421060 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db3a0f98-33b9-450d-91e4-6575a60cfb2e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "db3a0f98-33b9-450d-91e4-6575a60cfb2e" (UID: "db3a0f98-33b9-450d-91e4-6575a60cfb2e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.421342 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-t4vt4" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.432281 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db3a0f98-33b9-450d-91e4-6575a60cfb2e-kube-api-access-46v6s" (OuterVolumeSpecName: "kube-api-access-46v6s") pod "db3a0f98-33b9-450d-91e4-6575a60cfb2e" (UID: "db3a0f98-33b9-450d-91e4-6575a60cfb2e"). InnerVolumeSpecName "kube-api-access-46v6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.522146 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79tpx\" (UniqueName: \"kubernetes.io/projected/a0e93314-6c63-4319-bc2d-7ef3c6b917ec-kube-api-access-79tpx\") pod \"a0e93314-6c63-4319-bc2d-7ef3c6b917ec\" (UID: \"a0e93314-6c63-4319-bc2d-7ef3c6b917ec\") " Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.522284 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e93314-6c63-4319-bc2d-7ef3c6b917ec-operator-scripts\") pod \"a0e93314-6c63-4319-bc2d-7ef3c6b917ec\" (UID: \"a0e93314-6c63-4319-bc2d-7ef3c6b917ec\") " Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.522350 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mdmg\" (UniqueName: \"kubernetes.io/projected/76c84c12-1726-494a-91ab-598ce15287ae-kube-api-access-7mdmg\") pod \"76c84c12-1726-494a-91ab-598ce15287ae\" (UID: \"76c84c12-1726-494a-91ab-598ce15287ae\") " Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.522399 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8pp6\" (UniqueName: \"kubernetes.io/projected/26761262-f331-4bde-8b02-ff48aa5f3875-kube-api-access-g8pp6\") pod \"26761262-f331-4bde-8b02-ff48aa5f3875\" (UID: \"26761262-f331-4bde-8b02-ff48aa5f3875\") " Nov 24 
09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.522465 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26761262-f331-4bde-8b02-ff48aa5f3875-operator-scripts\") pod \"26761262-f331-4bde-8b02-ff48aa5f3875\" (UID: \"26761262-f331-4bde-8b02-ff48aa5f3875\") " Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.522484 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fd85b6a-57be-47d8-955d-187926600e97-operator-scripts\") pod \"4fd85b6a-57be-47d8-955d-187926600e97\" (UID: \"4fd85b6a-57be-47d8-955d-187926600e97\") " Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.522528 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76c84c12-1726-494a-91ab-598ce15287ae-operator-scripts\") pod \"76c84c12-1726-494a-91ab-598ce15287ae\" (UID: \"76c84c12-1726-494a-91ab-598ce15287ae\") " Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.522552 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b56d6\" (UniqueName: \"kubernetes.io/projected/4fd85b6a-57be-47d8-955d-187926600e97-kube-api-access-b56d6\") pod \"4fd85b6a-57be-47d8-955d-187926600e97\" (UID: \"4fd85b6a-57be-47d8-955d-187926600e97\") " Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.522881 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0e93314-6c63-4319-bc2d-7ef3c6b917ec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a0e93314-6c63-4319-bc2d-7ef3c6b917ec" (UID: "a0e93314-6c63-4319-bc2d-7ef3c6b917ec"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.523073 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26761262-f331-4bde-8b02-ff48aa5f3875-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "26761262-f331-4bde-8b02-ff48aa5f3875" (UID: "26761262-f331-4bde-8b02-ff48aa5f3875"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.523089 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76c84c12-1726-494a-91ab-598ce15287ae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "76c84c12-1726-494a-91ab-598ce15287ae" (UID: "76c84c12-1726-494a-91ab-598ce15287ae"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.523725 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fd85b6a-57be-47d8-955d-187926600e97-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4fd85b6a-57be-47d8-955d-187926600e97" (UID: "4fd85b6a-57be-47d8-955d-187926600e97"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.524253 4563 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76c84c12-1726-494a-91ab-598ce15287ae-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.524275 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46v6s\" (UniqueName: \"kubernetes.io/projected/db3a0f98-33b9-450d-91e4-6575a60cfb2e-kube-api-access-46v6s\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.524290 4563 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e93314-6c63-4319-bc2d-7ef3c6b917ec-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.524309 4563 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26761262-f331-4bde-8b02-ff48aa5f3875-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.524317 4563 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fd85b6a-57be-47d8-955d-187926600e97-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.524327 4563 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db3a0f98-33b9-450d-91e4-6575a60cfb2e-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.525746 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76c84c12-1726-494a-91ab-598ce15287ae-kube-api-access-7mdmg" (OuterVolumeSpecName: "kube-api-access-7mdmg") pod 
"76c84c12-1726-494a-91ab-598ce15287ae" (UID: "76c84c12-1726-494a-91ab-598ce15287ae"). InnerVolumeSpecName "kube-api-access-7mdmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.526422 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0e93314-6c63-4319-bc2d-7ef3c6b917ec-kube-api-access-79tpx" (OuterVolumeSpecName: "kube-api-access-79tpx") pod "a0e93314-6c63-4319-bc2d-7ef3c6b917ec" (UID: "a0e93314-6c63-4319-bc2d-7ef3c6b917ec"). InnerVolumeSpecName "kube-api-access-79tpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.526502 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fd85b6a-57be-47d8-955d-187926600e97-kube-api-access-b56d6" (OuterVolumeSpecName: "kube-api-access-b56d6") pod "4fd85b6a-57be-47d8-955d-187926600e97" (UID: "4fd85b6a-57be-47d8-955d-187926600e97"). InnerVolumeSpecName "kube-api-access-b56d6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.528232 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26761262-f331-4bde-8b02-ff48aa5f3875-kube-api-access-g8pp6" (OuterVolumeSpecName: "kube-api-access-g8pp6") pod "26761262-f331-4bde-8b02-ff48aa5f3875" (UID: "26761262-f331-4bde-8b02-ff48aa5f3875"). InnerVolumeSpecName "kube-api-access-g8pp6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.626766 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b56d6\" (UniqueName: \"kubernetes.io/projected/4fd85b6a-57be-47d8-955d-187926600e97-kube-api-access-b56d6\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.626808 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79tpx\" (UniqueName: \"kubernetes.io/projected/a0e93314-6c63-4319-bc2d-7ef3c6b917ec-kube-api-access-79tpx\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.626819 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mdmg\" (UniqueName: \"kubernetes.io/projected/76c84c12-1726-494a-91ab-598ce15287ae-kube-api-access-7mdmg\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.626829 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8pp6\" (UniqueName: \"kubernetes.io/projected/26761262-f331-4bde-8b02-ff48aa5f3875-kube-api-access-g8pp6\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.885920 4563 generic.go:334] "Generic (PLEG): container finished" podID="9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3" containerID="d9d022498ed0865c43bb0bed4db10fba9b82ea28d4406bb092ba58ed744d314c" exitCode=0 Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.886285 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qtfnl-config-xk28x" event={"ID":"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3","Type":"ContainerDied","Data":"d9d022498ed0865c43bb0bed4db10fba9b82ea28d4406bb092ba58ed744d314c"} Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.905785 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"9d44fee0-139c-42c9-8ad1-3991121f1d67","Type":"ContainerStarted","Data":"b1b75edeccbd39b866f09a836987ba0facbeefefc33ffdd8b2f12625c9abeb8b"} Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.906091 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d44fee0-139c-42c9-8ad1-3991121f1d67","Type":"ContainerStarted","Data":"558e3c7613d2b90d1a7c658d6aec623547685b75b5503629af9846676d85d65f"} Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.906104 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d44fee0-139c-42c9-8ad1-3991121f1d67","Type":"ContainerStarted","Data":"355a711bb71a63486607c8c44558005cac4d677b12362b3e2073161894e1a78a"} Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.906112 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d44fee0-139c-42c9-8ad1-3991121f1d67","Type":"ContainerStarted","Data":"40b02f35fa39410d69a9ce060798f0b7fd38a71c322895c5cc0310a364be0d4a"} Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.908588 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4b13-account-create-4626m" event={"ID":"26761262-f331-4bde-8b02-ff48aa5f3875","Type":"ContainerDied","Data":"23fa8d84f8040dd494e8340fa5057d1f31a95688c187b1f47584b25ffa12fd35"} Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.908619 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23fa8d84f8040dd494e8340fa5057d1f31a95688c187b1f47584b25ffa12fd35" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.908589 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4b13-account-create-4626m" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.911855 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-t4vt4" event={"ID":"76c84c12-1726-494a-91ab-598ce15287ae","Type":"ContainerDied","Data":"3d0e7cc1d08d09490e8871a933ce81f0e6236dfcc2d6b3872c7fb9e3702ef7d1"} Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.911879 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d0e7cc1d08d09490e8871a933ce81f0e6236dfcc2d6b3872c7fb9e3702ef7d1" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.911935 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-t4vt4" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.913753 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c035-account-create-q6pbl" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.913759 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c035-account-create-q6pbl" event={"ID":"a0e93314-6c63-4319-bc2d-7ef3c6b917ec","Type":"ContainerDied","Data":"85fa88aca93bf21819498cec51a5685063e6ad8a943c2d525866420b09d85cae"} Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.913790 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85fa88aca93bf21819498cec51a5685063e6ad8a943c2d525866420b09d85cae" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.915264 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d52e-account-create-p7ns9" event={"ID":"db3a0f98-33b9-450d-91e4-6575a60cfb2e","Type":"ContainerDied","Data":"8dab950d3e3defb8f4a6140b719434e9cfb12d143040ecf9b116d5727567de7d"} Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.915294 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-d52e-account-create-p7ns9" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.915314 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dab950d3e3defb8f4a6140b719434e9cfb12d143040ecf9b116d5727567de7d" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.916661 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-l585m" event={"ID":"4fd85b6a-57be-47d8-955d-187926600e97","Type":"ContainerDied","Data":"c2a35c883f423671c9a24327292d938fb5981d2b79e5d5f29e795701f2f69a1d"} Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.916687 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2a35c883f423671c9a24327292d938fb5981d2b79e5d5f29e795701f2f69a1d" Nov 24 09:18:17 crc kubenswrapper[4563]: I1124 09:18:17.916688 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-l585m" Nov 24 09:18:18 crc kubenswrapper[4563]: I1124 09:18:18.927561 4563 generic.go:334] "Generic (PLEG): container finished" podID="20286861-2552-4ff4-a5a1-3d67c9e0cb7b" containerID="cf36b861866fe18ca83c30e811f07dfc6934b6b4d377c37c561572c96f7bfff6" exitCode=0 Nov 24 09:18:18 crc kubenswrapper[4563]: I1124 09:18:18.927656 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6msdh" event={"ID":"20286861-2552-4ff4-a5a1-3d67c9e0cb7b","Type":"ContainerDied","Data":"cf36b861866fe18ca83c30e811f07dfc6934b6b4d377c37c561572c96f7bfff6"} Nov 24 09:18:19 crc kubenswrapper[4563]: I1124 09:18:19.817397 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-qtfnl-config-xk28x" Nov 24 09:18:19 crc kubenswrapper[4563]: I1124 09:18:19.966314 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-var-log-ovn\") pod \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\" (UID: \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\") " Nov 24 09:18:19 crc kubenswrapper[4563]: I1124 09:18:19.967124 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-var-run\") pod \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\" (UID: \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\") " Nov 24 09:18:19 crc kubenswrapper[4563]: I1124 09:18:19.967457 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-additional-scripts\") pod \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\" (UID: \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\") " Nov 24 09:18:19 crc kubenswrapper[4563]: I1124 09:18:19.967570 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-var-run-ovn\") pod \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\" (UID: \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\") " Nov 24 09:18:19 crc kubenswrapper[4563]: I1124 09:18:19.967682 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bptgj\" (UniqueName: \"kubernetes.io/projected/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-kube-api-access-bptgj\") pod \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\" (UID: \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\") " Nov 24 09:18:19 crc kubenswrapper[4563]: I1124 09:18:19.967774 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-scripts\") pod \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\" (UID: \"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3\") " Nov 24 09:18:19 crc kubenswrapper[4563]: I1124 09:18:19.968686 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3" (UID: "9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:19 crc kubenswrapper[4563]: I1124 09:18:19.968753 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3" (UID: "9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:18:19 crc kubenswrapper[4563]: I1124 09:18:19.968781 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3" (UID: "9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:18:19 crc kubenswrapper[4563]: I1124 09:18:19.968776 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-var-run" (OuterVolumeSpecName: "var-run") pod "9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3" (UID: "9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:18:19 crc kubenswrapper[4563]: I1124 09:18:19.969418 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-scripts" (OuterVolumeSpecName: "scripts") pod "9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3" (UID: "9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:19 crc kubenswrapper[4563]: I1124 09:18:19.971522 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qtfnl-config-xk28x" Nov 24 09:18:19 crc kubenswrapper[4563]: I1124 09:18:19.971737 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qtfnl-config-xk28x" event={"ID":"9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3","Type":"ContainerDied","Data":"eb367c76a129cf3a11d68f67cb3c135e18779a938f32b20bf499dc79bca11a08"} Nov 24 09:18:19 crc kubenswrapper[4563]: I1124 09:18:19.971817 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb367c76a129cf3a11d68f67cb3c135e18779a938f32b20bf499dc79bca11a08" Nov 24 09:18:19 crc kubenswrapper[4563]: I1124 09:18:19.982357 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-kube-api-access-bptgj" (OuterVolumeSpecName: "kube-api-access-bptgj") pod "9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3" (UID: "9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3"). InnerVolumeSpecName "kube-api-access-bptgj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:18:20 crc kubenswrapper[4563]: I1124 09:18:20.070330 4563 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:20 crc kubenswrapper[4563]: I1124 09:18:20.070388 4563 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-var-run\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:20 crc kubenswrapper[4563]: I1124 09:18:20.070400 4563 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-additional-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:20 crc kubenswrapper[4563]: I1124 09:18:20.070414 4563 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:20 crc kubenswrapper[4563]: I1124 09:18:20.070422 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bptgj\" (UniqueName: \"kubernetes.io/projected/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-kube-api-access-bptgj\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:20 crc kubenswrapper[4563]: I1124 09:18:20.070431 4563 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:20 crc kubenswrapper[4563]: I1124 09:18:20.272101 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-6msdh" Nov 24 09:18:20 crc kubenswrapper[4563]: I1124 09:18:20.375465 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/20286861-2552-4ff4-a5a1-3d67c9e0cb7b-db-sync-config-data\") pod \"20286861-2552-4ff4-a5a1-3d67c9e0cb7b\" (UID: \"20286861-2552-4ff4-a5a1-3d67c9e0cb7b\") " Nov 24 09:18:20 crc kubenswrapper[4563]: I1124 09:18:20.375544 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20286861-2552-4ff4-a5a1-3d67c9e0cb7b-config-data\") pod \"20286861-2552-4ff4-a5a1-3d67c9e0cb7b\" (UID: \"20286861-2552-4ff4-a5a1-3d67c9e0cb7b\") " Nov 24 09:18:20 crc kubenswrapper[4563]: I1124 09:18:20.375581 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20286861-2552-4ff4-a5a1-3d67c9e0cb7b-combined-ca-bundle\") pod \"20286861-2552-4ff4-a5a1-3d67c9e0cb7b\" (UID: \"20286861-2552-4ff4-a5a1-3d67c9e0cb7b\") " Nov 24 09:18:20 crc kubenswrapper[4563]: I1124 09:18:20.376138 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hf6wh\" (UniqueName: \"kubernetes.io/projected/20286861-2552-4ff4-a5a1-3d67c9e0cb7b-kube-api-access-hf6wh\") pod \"20286861-2552-4ff4-a5a1-3d67c9e0cb7b\" (UID: \"20286861-2552-4ff4-a5a1-3d67c9e0cb7b\") " Nov 24 09:18:20 crc kubenswrapper[4563]: I1124 09:18:20.379858 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20286861-2552-4ff4-a5a1-3d67c9e0cb7b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "20286861-2552-4ff4-a5a1-3d67c9e0cb7b" (UID: "20286861-2552-4ff4-a5a1-3d67c9e0cb7b"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:20 crc kubenswrapper[4563]: I1124 09:18:20.379958 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20286861-2552-4ff4-a5a1-3d67c9e0cb7b-kube-api-access-hf6wh" (OuterVolumeSpecName: "kube-api-access-hf6wh") pod "20286861-2552-4ff4-a5a1-3d67c9e0cb7b" (UID: "20286861-2552-4ff4-a5a1-3d67c9e0cb7b"). InnerVolumeSpecName "kube-api-access-hf6wh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:18:20 crc kubenswrapper[4563]: I1124 09:18:20.396420 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20286861-2552-4ff4-a5a1-3d67c9e0cb7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20286861-2552-4ff4-a5a1-3d67c9e0cb7b" (UID: "20286861-2552-4ff4-a5a1-3d67c9e0cb7b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:20 crc kubenswrapper[4563]: I1124 09:18:20.412204 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20286861-2552-4ff4-a5a1-3d67c9e0cb7b-config-data" (OuterVolumeSpecName: "config-data") pod "20286861-2552-4ff4-a5a1-3d67c9e0cb7b" (UID: "20286861-2552-4ff4-a5a1-3d67c9e0cb7b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:20 crc kubenswrapper[4563]: I1124 09:18:20.477993 4563 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/20286861-2552-4ff4-a5a1-3d67c9e0cb7b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:20 crc kubenswrapper[4563]: I1124 09:18:20.478025 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20286861-2552-4ff4-a5a1-3d67c9e0cb7b-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:20 crc kubenswrapper[4563]: I1124 09:18:20.478035 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20286861-2552-4ff4-a5a1-3d67c9e0cb7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:20 crc kubenswrapper[4563]: I1124 09:18:20.478043 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hf6wh\" (UniqueName: \"kubernetes.io/projected/20286861-2552-4ff4-a5a1-3d67c9e0cb7b-kube-api-access-hf6wh\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:20 crc kubenswrapper[4563]: I1124 09:18:20.899614 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-qtfnl-config-xk28x"] Nov 24 09:18:20 crc kubenswrapper[4563]: I1124 09:18:20.906575 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-qtfnl-config-xk28x"] Nov 24 09:18:20 crc kubenswrapper[4563]: I1124 09:18:20.982239 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-6msdh" Nov 24 09:18:20 crc kubenswrapper[4563]: I1124 09:18:20.982252 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6msdh" event={"ID":"20286861-2552-4ff4-a5a1-3d67c9e0cb7b","Type":"ContainerDied","Data":"c2e03d99ef699a09e98ef4f5039f0b9d3dc475b18d1422c5779806dcf5972bb4"} Nov 24 09:18:20 crc kubenswrapper[4563]: I1124 09:18:20.982345 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2e03d99ef699a09e98ef4f5039f0b9d3dc475b18d1422c5779806dcf5972bb4" Nov 24 09:18:20 crc kubenswrapper[4563]: I1124 09:18:20.984140 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4w2gs" event={"ID":"d1190dd8-6aef-4116-be7f-e498cfe0db11","Type":"ContainerStarted","Data":"d3c46f2138d7e9e9519753124f1b46db0e7ec960bdf134ed0955e1c1a103c617"} Nov 24 09:18:20 crc kubenswrapper[4563]: I1124 09:18:20.990896 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d44fee0-139c-42c9-8ad1-3991121f1d67","Type":"ContainerStarted","Data":"0f20995e132c9bdee4c6aa8a7079a341d784faaeb7e4a89ae698890c20a6592c"} Nov 24 09:18:20 crc kubenswrapper[4563]: I1124 09:18:20.990927 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9d44fee0-139c-42c9-8ad1-3991121f1d67","Type":"ContainerStarted","Data":"813671abfcaa7eb3ed22d547cee65e5131182a9aa473926f328814f55930bee1"} Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.009325 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-4w2gs" podStartSLOduration=3.200855083 podStartE2EDuration="8.00929955s" podCreationTimestamp="2025-11-24 09:18:13 +0000 UTC" firstStartedPulling="2025-11-24 09:18:14.836485173 +0000 UTC m=+872.095462620" lastFinishedPulling="2025-11-24 09:18:19.64492964 +0000 UTC m=+876.903907087" observedRunningTime="2025-11-24 09:18:21.003508757 
+0000 UTC m=+878.262486204" watchObservedRunningTime="2025-11-24 09:18:21.00929955 +0000 UTC m=+878.268276997" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.047045 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=22.392581885 podStartE2EDuration="33.047015232s" podCreationTimestamp="2025-11-24 09:17:48 +0000 UTC" firstStartedPulling="2025-11-24 09:18:05.896962025 +0000 UTC m=+863.155939482" lastFinishedPulling="2025-11-24 09:18:16.551395382 +0000 UTC m=+873.810372829" observedRunningTime="2025-11-24 09:18:21.036392445 +0000 UTC m=+878.295369891" watchObservedRunningTime="2025-11-24 09:18:21.047015232 +0000 UTC m=+878.305992669" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.066803 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3" path="/var/lib/kubelet/pods/9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3/volumes" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.218324 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84ddf475bf-48vsf"] Nov 24 09:18:21 crc kubenswrapper[4563]: E1124 09:18:21.218655 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76c84c12-1726-494a-91ab-598ce15287ae" containerName="mariadb-database-create" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.218674 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="76c84c12-1726-494a-91ab-598ce15287ae" containerName="mariadb-database-create" Nov 24 09:18:21 crc kubenswrapper[4563]: E1124 09:18:21.218693 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550" containerName="mariadb-database-create" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.218699 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550" containerName="mariadb-database-create" Nov 24 09:18:21 crc kubenswrapper[4563]: 
E1124 09:18:21.218710 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3" containerName="ovn-config" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.218714 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3" containerName="ovn-config" Nov 24 09:18:21 crc kubenswrapper[4563]: E1124 09:18:21.218724 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fd85b6a-57be-47d8-955d-187926600e97" containerName="mariadb-database-create" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.218729 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fd85b6a-57be-47d8-955d-187926600e97" containerName="mariadb-database-create" Nov 24 09:18:21 crc kubenswrapper[4563]: E1124 09:18:21.218738 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e93314-6c63-4319-bc2d-7ef3c6b917ec" containerName="mariadb-account-create" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.218745 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e93314-6c63-4319-bc2d-7ef3c6b917ec" containerName="mariadb-account-create" Nov 24 09:18:21 crc kubenswrapper[4563]: E1124 09:18:21.218757 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db3a0f98-33b9-450d-91e4-6575a60cfb2e" containerName="mariadb-account-create" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.218763 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="db3a0f98-33b9-450d-91e4-6575a60cfb2e" containerName="mariadb-account-create" Nov 24 09:18:21 crc kubenswrapper[4563]: E1124 09:18:21.218772 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20286861-2552-4ff4-a5a1-3d67c9e0cb7b" containerName="glance-db-sync" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.218777 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="20286861-2552-4ff4-a5a1-3d67c9e0cb7b" containerName="glance-db-sync" Nov 24 09:18:21 crc 
kubenswrapper[4563]: E1124 09:18:21.218784 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26761262-f331-4bde-8b02-ff48aa5f3875" containerName="mariadb-account-create" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.218791 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="26761262-f331-4bde-8b02-ff48aa5f3875" containerName="mariadb-account-create" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.218938 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0e93314-6c63-4319-bc2d-7ef3c6b917ec" containerName="mariadb-account-create" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.218952 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="9196b8a0-3cfc-4189-bb0d-ad19c1e1a2f3" containerName="ovn-config" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.218964 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550" containerName="mariadb-database-create" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.218974 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fd85b6a-57be-47d8-955d-187926600e97" containerName="mariadb-database-create" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.218984 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="db3a0f98-33b9-450d-91e4-6575a60cfb2e" containerName="mariadb-account-create" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.218993 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="26761262-f331-4bde-8b02-ff48aa5f3875" containerName="mariadb-account-create" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.219007 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="20286861-2552-4ff4-a5a1-3d67c9e0cb7b" containerName="glance-db-sync" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.219014 4563 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="76c84c12-1726-494a-91ab-598ce15287ae" containerName="mariadb-database-create" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.219789 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84ddf475bf-48vsf" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.233855 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84ddf475bf-48vsf"] Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.290052 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-ovsdbserver-nb\") pod \"dnsmasq-dns-84ddf475bf-48vsf\" (UID: \"9afed3b6-6377-4699-8ed3-ffbcff7b1c13\") " pod="openstack/dnsmasq-dns-84ddf475bf-48vsf" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.290108 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch5dg\" (UniqueName: \"kubernetes.io/projected/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-kube-api-access-ch5dg\") pod \"dnsmasq-dns-84ddf475bf-48vsf\" (UID: \"9afed3b6-6377-4699-8ed3-ffbcff7b1c13\") " pod="openstack/dnsmasq-dns-84ddf475bf-48vsf" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.290159 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-dns-svc\") pod \"dnsmasq-dns-84ddf475bf-48vsf\" (UID: \"9afed3b6-6377-4699-8ed3-ffbcff7b1c13\") " pod="openstack/dnsmasq-dns-84ddf475bf-48vsf" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.290190 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-config\") pod \"dnsmasq-dns-84ddf475bf-48vsf\" (UID: 
\"9afed3b6-6377-4699-8ed3-ffbcff7b1c13\") " pod="openstack/dnsmasq-dns-84ddf475bf-48vsf" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.290266 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-ovsdbserver-sb\") pod \"dnsmasq-dns-84ddf475bf-48vsf\" (UID: \"9afed3b6-6377-4699-8ed3-ffbcff7b1c13\") " pod="openstack/dnsmasq-dns-84ddf475bf-48vsf" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.375001 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84ddf475bf-48vsf"] Nov 24 09:18:21 crc kubenswrapper[4563]: E1124 09:18:21.376071 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-ch5dg ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-84ddf475bf-48vsf" podUID="9afed3b6-6377-4699-8ed3-ffbcff7b1c13" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.392141 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-ovsdbserver-nb\") pod \"dnsmasq-dns-84ddf475bf-48vsf\" (UID: \"9afed3b6-6377-4699-8ed3-ffbcff7b1c13\") " pod="openstack/dnsmasq-dns-84ddf475bf-48vsf" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.392326 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch5dg\" (UniqueName: \"kubernetes.io/projected/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-kube-api-access-ch5dg\") pod \"dnsmasq-dns-84ddf475bf-48vsf\" (UID: \"9afed3b6-6377-4699-8ed3-ffbcff7b1c13\") " pod="openstack/dnsmasq-dns-84ddf475bf-48vsf" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.392441 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-dns-svc\") pod \"dnsmasq-dns-84ddf475bf-48vsf\" (UID: \"9afed3b6-6377-4699-8ed3-ffbcff7b1c13\") " pod="openstack/dnsmasq-dns-84ddf475bf-48vsf" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.392552 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-config\") pod \"dnsmasq-dns-84ddf475bf-48vsf\" (UID: \"9afed3b6-6377-4699-8ed3-ffbcff7b1c13\") " pod="openstack/dnsmasq-dns-84ddf475bf-48vsf" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.392703 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-ovsdbserver-sb\") pod \"dnsmasq-dns-84ddf475bf-48vsf\" (UID: \"9afed3b6-6377-4699-8ed3-ffbcff7b1c13\") " pod="openstack/dnsmasq-dns-84ddf475bf-48vsf" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.393158 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-ovsdbserver-nb\") pod \"dnsmasq-dns-84ddf475bf-48vsf\" (UID: \"9afed3b6-6377-4699-8ed3-ffbcff7b1c13\") " pod="openstack/dnsmasq-dns-84ddf475bf-48vsf" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.393257 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-dns-svc\") pod \"dnsmasq-dns-84ddf475bf-48vsf\" (UID: \"9afed3b6-6377-4699-8ed3-ffbcff7b1c13\") " pod="openstack/dnsmasq-dns-84ddf475bf-48vsf" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.393446 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-config\") pod \"dnsmasq-dns-84ddf475bf-48vsf\" (UID: 
\"9afed3b6-6377-4699-8ed3-ffbcff7b1c13\") " pod="openstack/dnsmasq-dns-84ddf475bf-48vsf" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.393503 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-ovsdbserver-sb\") pod \"dnsmasq-dns-84ddf475bf-48vsf\" (UID: \"9afed3b6-6377-4699-8ed3-ffbcff7b1c13\") " pod="openstack/dnsmasq-dns-84ddf475bf-48vsf" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.410135 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6856c564b9-8cxgs"] Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.411383 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.413005 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.416399 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6856c564b9-8cxgs"] Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.429094 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch5dg\" (UniqueName: \"kubernetes.io/projected/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-kube-api-access-ch5dg\") pod \"dnsmasq-dns-84ddf475bf-48vsf\" (UID: \"9afed3b6-6377-4699-8ed3-ffbcff7b1c13\") " pod="openstack/dnsmasq-dns-84ddf475bf-48vsf" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.494895 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-ovsdbserver-sb\") pod \"dnsmasq-dns-6856c564b9-8cxgs\" (UID: \"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\") " pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 
09:18:21.494947 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-config\") pod \"dnsmasq-dns-6856c564b9-8cxgs\" (UID: \"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\") " pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.495019 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-ovsdbserver-nb\") pod \"dnsmasq-dns-6856c564b9-8cxgs\" (UID: \"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\") " pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.495084 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-dns-svc\") pod \"dnsmasq-dns-6856c564b9-8cxgs\" (UID: \"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\") " pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.495110 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-dns-swift-storage-0\") pod \"dnsmasq-dns-6856c564b9-8cxgs\" (UID: \"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\") " pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.495136 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptj59\" (UniqueName: \"kubernetes.io/projected/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-kube-api-access-ptj59\") pod \"dnsmasq-dns-6856c564b9-8cxgs\" (UID: \"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\") " pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" Nov 24 
09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.597548 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-ovsdbserver-sb\") pod \"dnsmasq-dns-6856c564b9-8cxgs\" (UID: \"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\") " pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.597911 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-config\") pod \"dnsmasq-dns-6856c564b9-8cxgs\" (UID: \"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\") " pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.598084 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-ovsdbserver-nb\") pod \"dnsmasq-dns-6856c564b9-8cxgs\" (UID: \"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\") " pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.598234 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-dns-svc\") pod \"dnsmasq-dns-6856c564b9-8cxgs\" (UID: \"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\") " pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.598339 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-dns-swift-storage-0\") pod \"dnsmasq-dns-6856c564b9-8cxgs\" (UID: \"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\") " pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.598445 4563 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptj59\" (UniqueName: \"kubernetes.io/projected/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-kube-api-access-ptj59\") pod \"dnsmasq-dns-6856c564b9-8cxgs\" (UID: \"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\") " pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.598856 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-ovsdbserver-sb\") pod \"dnsmasq-dns-6856c564b9-8cxgs\" (UID: \"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\") " pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.598907 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-config\") pod \"dnsmasq-dns-6856c564b9-8cxgs\" (UID: \"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\") " pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.599547 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-ovsdbserver-nb\") pod \"dnsmasq-dns-6856c564b9-8cxgs\" (UID: \"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\") " pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.599551 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-dns-svc\") pod \"dnsmasq-dns-6856c564b9-8cxgs\" (UID: \"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\") " pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.600108 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-dns-swift-storage-0\") pod \"dnsmasq-dns-6856c564b9-8cxgs\" (UID: \"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\") " pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.618904 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptj59\" (UniqueName: \"kubernetes.io/projected/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-kube-api-access-ptj59\") pod \"dnsmasq-dns-6856c564b9-8cxgs\" (UID: \"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\") " pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" Nov 24 09:18:21 crc kubenswrapper[4563]: I1124 09:18:21.729998 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" Nov 24 09:18:22 crc kubenswrapper[4563]: I1124 09:18:22.014336 4563 generic.go:334] "Generic (PLEG): container finished" podID="d1190dd8-6aef-4116-be7f-e498cfe0db11" containerID="d3c46f2138d7e9e9519753124f1b46db0e7ec960bdf134ed0955e1c1a103c617" exitCode=0 Nov 24 09:18:22 crc kubenswrapper[4563]: I1124 09:18:22.014513 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4w2gs" event={"ID":"d1190dd8-6aef-4116-be7f-e498cfe0db11","Type":"ContainerDied","Data":"d3c46f2138d7e9e9519753124f1b46db0e7ec960bdf134ed0955e1c1a103c617"} Nov 24 09:18:22 crc kubenswrapper[4563]: I1124 09:18:22.017280 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84ddf475bf-48vsf" Nov 24 09:18:22 crc kubenswrapper[4563]: I1124 09:18:22.031054 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84ddf475bf-48vsf" Nov 24 09:18:22 crc kubenswrapper[4563]: E1124 09:18:22.047198 4563 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1190dd8_6aef_4116_be7f_e498cfe0db11.slice/crio-d3c46f2138d7e9e9519753124f1b46db0e7ec960bdf134ed0955e1c1a103c617.scope\": RecentStats: unable to find data in memory cache]" Nov 24 09:18:22 crc kubenswrapper[4563]: I1124 09:18:22.107173 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6856c564b9-8cxgs"] Nov 24 09:18:22 crc kubenswrapper[4563]: W1124 09:18:22.108299 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6c54567_0ce3_4ce9_a2b9_30e16c7760bc.slice/crio-11eadec90d1894f725c8bb09b9f3ab86a6cec0d4f49dd2d5368b5dfbe90e50fd WatchSource:0}: Error finding container 11eadec90d1894f725c8bb09b9f3ab86a6cec0d4f49dd2d5368b5dfbe90e50fd: Status 404 returned error can't find the container with id 11eadec90d1894f725c8bb09b9f3ab86a6cec0d4f49dd2d5368b5dfbe90e50fd Nov 24 09:18:22 crc kubenswrapper[4563]: I1124 09:18:22.109011 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-config\") pod \"9afed3b6-6377-4699-8ed3-ffbcff7b1c13\" (UID: \"9afed3b6-6377-4699-8ed3-ffbcff7b1c13\") " Nov 24 09:18:22 crc kubenswrapper[4563]: I1124 09:18:22.109163 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-dns-svc\") pod \"9afed3b6-6377-4699-8ed3-ffbcff7b1c13\" (UID: \"9afed3b6-6377-4699-8ed3-ffbcff7b1c13\") " Nov 24 09:18:22 crc kubenswrapper[4563]: I1124 09:18:22.109251 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-ch5dg\" (UniqueName: \"kubernetes.io/projected/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-kube-api-access-ch5dg\") pod \"9afed3b6-6377-4699-8ed3-ffbcff7b1c13\" (UID: \"9afed3b6-6377-4699-8ed3-ffbcff7b1c13\") " Nov 24 09:18:22 crc kubenswrapper[4563]: I1124 09:18:22.109345 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-ovsdbserver-nb\") pod \"9afed3b6-6377-4699-8ed3-ffbcff7b1c13\" (UID: \"9afed3b6-6377-4699-8ed3-ffbcff7b1c13\") " Nov 24 09:18:22 crc kubenswrapper[4563]: I1124 09:18:22.109417 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-ovsdbserver-sb\") pod \"9afed3b6-6377-4699-8ed3-ffbcff7b1c13\" (UID: \"9afed3b6-6377-4699-8ed3-ffbcff7b1c13\") " Nov 24 09:18:22 crc kubenswrapper[4563]: I1124 09:18:22.110186 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9afed3b6-6377-4699-8ed3-ffbcff7b1c13" (UID: "9afed3b6-6377-4699-8ed3-ffbcff7b1c13"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:22 crc kubenswrapper[4563]: I1124 09:18:22.110216 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9afed3b6-6377-4699-8ed3-ffbcff7b1c13" (UID: "9afed3b6-6377-4699-8ed3-ffbcff7b1c13"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:22 crc kubenswrapper[4563]: I1124 09:18:22.110604 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-config" (OuterVolumeSpecName: "config") pod "9afed3b6-6377-4699-8ed3-ffbcff7b1c13" (UID: "9afed3b6-6377-4699-8ed3-ffbcff7b1c13"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:22 crc kubenswrapper[4563]: I1124 09:18:22.111175 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9afed3b6-6377-4699-8ed3-ffbcff7b1c13" (UID: "9afed3b6-6377-4699-8ed3-ffbcff7b1c13"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:22 crc kubenswrapper[4563]: I1124 09:18:22.111605 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:22 crc kubenswrapper[4563]: I1124 09:18:22.111626 4563 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:22 crc kubenswrapper[4563]: I1124 09:18:22.111655 4563 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:22 crc kubenswrapper[4563]: I1124 09:18:22.111667 4563 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:22 crc kubenswrapper[4563]: 
I1124 09:18:22.114595 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-kube-api-access-ch5dg" (OuterVolumeSpecName: "kube-api-access-ch5dg") pod "9afed3b6-6377-4699-8ed3-ffbcff7b1c13" (UID: "9afed3b6-6377-4699-8ed3-ffbcff7b1c13"). InnerVolumeSpecName "kube-api-access-ch5dg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:18:22 crc kubenswrapper[4563]: I1124 09:18:22.213712 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ch5dg\" (UniqueName: \"kubernetes.io/projected/9afed3b6-6377-4699-8ed3-ffbcff7b1c13-kube-api-access-ch5dg\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:23 crc kubenswrapper[4563]: I1124 09:18:23.023212 4563 generic.go:334] "Generic (PLEG): container finished" podID="b6c54567-0ce3-4ce9-a2b9-30e16c7760bc" containerID="f020b5325ddf6f6e930c06a9f359eae126eebf2627cc81e3f18271b0687b3156" exitCode=0 Nov 24 09:18:23 crc kubenswrapper[4563]: I1124 09:18:23.023409 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84ddf475bf-48vsf" Nov 24 09:18:23 crc kubenswrapper[4563]: I1124 09:18:23.023402 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" event={"ID":"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc","Type":"ContainerDied","Data":"f020b5325ddf6f6e930c06a9f359eae126eebf2627cc81e3f18271b0687b3156"} Nov 24 09:18:23 crc kubenswrapper[4563]: I1124 09:18:23.023474 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" event={"ID":"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc","Type":"ContainerStarted","Data":"11eadec90d1894f725c8bb09b9f3ab86a6cec0d4f49dd2d5368b5dfbe90e50fd"} Nov 24 09:18:23 crc kubenswrapper[4563]: I1124 09:18:23.230860 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-4w2gs" Nov 24 09:18:23 crc kubenswrapper[4563]: I1124 09:18:23.329525 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1190dd8-6aef-4116-be7f-e498cfe0db11-config-data\") pod \"d1190dd8-6aef-4116-be7f-e498cfe0db11\" (UID: \"d1190dd8-6aef-4116-be7f-e498cfe0db11\") " Nov 24 09:18:23 crc kubenswrapper[4563]: I1124 09:18:23.329826 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cznvl\" (UniqueName: \"kubernetes.io/projected/d1190dd8-6aef-4116-be7f-e498cfe0db11-kube-api-access-cznvl\") pod \"d1190dd8-6aef-4116-be7f-e498cfe0db11\" (UID: \"d1190dd8-6aef-4116-be7f-e498cfe0db11\") " Nov 24 09:18:23 crc kubenswrapper[4563]: I1124 09:18:23.329857 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1190dd8-6aef-4116-be7f-e498cfe0db11-combined-ca-bundle\") pod \"d1190dd8-6aef-4116-be7f-e498cfe0db11\" (UID: \"d1190dd8-6aef-4116-be7f-e498cfe0db11\") " Nov 24 09:18:23 crc kubenswrapper[4563]: I1124 09:18:23.335247 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1190dd8-6aef-4116-be7f-e498cfe0db11-kube-api-access-cznvl" (OuterVolumeSpecName: "kube-api-access-cznvl") pod "d1190dd8-6aef-4116-be7f-e498cfe0db11" (UID: "d1190dd8-6aef-4116-be7f-e498cfe0db11"). InnerVolumeSpecName "kube-api-access-cznvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:18:23 crc kubenswrapper[4563]: I1124 09:18:23.352060 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1190dd8-6aef-4116-be7f-e498cfe0db11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1190dd8-6aef-4116-be7f-e498cfe0db11" (UID: "d1190dd8-6aef-4116-be7f-e498cfe0db11"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:23 crc kubenswrapper[4563]: I1124 09:18:23.367206 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1190dd8-6aef-4116-be7f-e498cfe0db11-config-data" (OuterVolumeSpecName: "config-data") pod "d1190dd8-6aef-4116-be7f-e498cfe0db11" (UID: "d1190dd8-6aef-4116-be7f-e498cfe0db11"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:23 crc kubenswrapper[4563]: I1124 09:18:23.431351 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cznvl\" (UniqueName: \"kubernetes.io/projected/d1190dd8-6aef-4116-be7f-e498cfe0db11-kube-api-access-cznvl\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:23 crc kubenswrapper[4563]: I1124 09:18:23.431380 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1190dd8-6aef-4116-be7f-e498cfe0db11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:23 crc kubenswrapper[4563]: I1124 09:18:23.431394 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1190dd8-6aef-4116-be7f-e498cfe0db11-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.031510 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4w2gs" event={"ID":"d1190dd8-6aef-4116-be7f-e498cfe0db11","Type":"ContainerDied","Data":"1c0060a633c96b076cb9e07e135404adab02bb4bc710233ec50afe9829775e97"} Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.031798 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c0060a633c96b076cb9e07e135404adab02bb4bc710233ec50afe9829775e97" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.031577 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-4w2gs" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.035127 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" event={"ID":"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc","Type":"ContainerStarted","Data":"97c9fa9d60e8d8a75bc153f26d281fff59c9fb9a21a3a15ab4df31a3c3276851"} Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.035282 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.056103 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" podStartSLOduration=3.056088054 podStartE2EDuration="3.056088054s" podCreationTimestamp="2025-11-24 09:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:18:24.050584803 +0000 UTC m=+881.309562250" watchObservedRunningTime="2025-11-24 09:18:24.056088054 +0000 UTC m=+881.315065501" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.212723 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6856c564b9-8cxgs"] Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.229355 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7dbf8bff67-72zrm"] Nov 24 09:18:24 crc kubenswrapper[4563]: E1124 09:18:24.229632 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1190dd8-6aef-4116-be7f-e498cfe0db11" containerName="keystone-db-sync" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.229663 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1190dd8-6aef-4116-be7f-e498cfe0db11" containerName="keystone-db-sync" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.229834 4563 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d1190dd8-6aef-4116-be7f-e498cfe0db11" containerName="keystone-db-sync" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.230529 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7dbf8bff67-72zrm" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.262511 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7dbf8bff67-72zrm"] Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.305629 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-dxkdq"] Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.309788 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dxkdq" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.316115 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.316181 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.316122 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.316379 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.324045 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dxkdq"] Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.329942 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sw8dk" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.352515 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-config\") pod 
\"dnsmasq-dns-7dbf8bff67-72zrm\" (UID: \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\") " pod="openstack/dnsmasq-dns-7dbf8bff67-72zrm" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.352682 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-ovsdbserver-sb\") pod \"dnsmasq-dns-7dbf8bff67-72zrm\" (UID: \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\") " pod="openstack/dnsmasq-dns-7dbf8bff67-72zrm" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.352858 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-dns-svc\") pod \"dnsmasq-dns-7dbf8bff67-72zrm\" (UID: \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\") " pod="openstack/dnsmasq-dns-7dbf8bff67-72zrm" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.352893 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-ovsdbserver-nb\") pod \"dnsmasq-dns-7dbf8bff67-72zrm\" (UID: \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\") " pod="openstack/dnsmasq-dns-7dbf8bff67-72zrm" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.352937 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8cl4\" (UniqueName: \"kubernetes.io/projected/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-kube-api-access-g8cl4\") pod \"dnsmasq-dns-7dbf8bff67-72zrm\" (UID: \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\") " pod="openstack/dnsmasq-dns-7dbf8bff67-72zrm" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.352995 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-dns-swift-storage-0\") pod \"dnsmasq-dns-7dbf8bff67-72zrm\" (UID: \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\") " pod="openstack/dnsmasq-dns-7dbf8bff67-72zrm" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.435006 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-895bd5bbf-mc6v8"] Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.436672 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-895bd5bbf-mc6v8" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.451445 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.451632 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.451893 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-vbkrx" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.452077 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.454328 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-scripts\") pod \"keystone-bootstrap-dxkdq\" (UID: \"6e3d5e6c-5414-4094-8673-4d2626067ce8\") " pod="openstack/keystone-bootstrap-dxkdq" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.454375 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-fernet-keys\") pod \"keystone-bootstrap-dxkdq\" (UID: \"6e3d5e6c-5414-4094-8673-4d2626067ce8\") " pod="openstack/keystone-bootstrap-dxkdq" Nov 24 
09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.454418 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-ovsdbserver-sb\") pod \"dnsmasq-dns-7dbf8bff67-72zrm\" (UID: \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\") " pod="openstack/dnsmasq-dns-7dbf8bff67-72zrm" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.454468 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-credential-keys\") pod \"keystone-bootstrap-dxkdq\" (UID: \"6e3d5e6c-5414-4094-8673-4d2626067ce8\") " pod="openstack/keystone-bootstrap-dxkdq" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.454532 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-dns-svc\") pod \"dnsmasq-dns-7dbf8bff67-72zrm\" (UID: \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\") " pod="openstack/dnsmasq-dns-7dbf8bff67-72zrm" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.454560 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-ovsdbserver-nb\") pod \"dnsmasq-dns-7dbf8bff67-72zrm\" (UID: \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\") " pod="openstack/dnsmasq-dns-7dbf8bff67-72zrm" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.454587 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m8m5\" (UniqueName: \"kubernetes.io/projected/6e3d5e6c-5414-4094-8673-4d2626067ce8-kube-api-access-4m8m5\") pod \"keystone-bootstrap-dxkdq\" (UID: \"6e3d5e6c-5414-4094-8673-4d2626067ce8\") " pod="openstack/keystone-bootstrap-dxkdq" Nov 24 09:18:24 crc kubenswrapper[4563]: 
I1124 09:18:24.454612 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8cl4\" (UniqueName: \"kubernetes.io/projected/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-kube-api-access-g8cl4\") pod \"dnsmasq-dns-7dbf8bff67-72zrm\" (UID: \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\") " pod="openstack/dnsmasq-dns-7dbf8bff67-72zrm" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.454675 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-config-data\") pod \"keystone-bootstrap-dxkdq\" (UID: \"6e3d5e6c-5414-4094-8673-4d2626067ce8\") " pod="openstack/keystone-bootstrap-dxkdq" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.454726 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-dns-swift-storage-0\") pod \"dnsmasq-dns-7dbf8bff67-72zrm\" (UID: \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\") " pod="openstack/dnsmasq-dns-7dbf8bff67-72zrm" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.454820 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-config\") pod \"dnsmasq-dns-7dbf8bff67-72zrm\" (UID: \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\") " pod="openstack/dnsmasq-dns-7dbf8bff67-72zrm" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.454859 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-combined-ca-bundle\") pod \"keystone-bootstrap-dxkdq\" (UID: \"6e3d5e6c-5414-4094-8673-4d2626067ce8\") " pod="openstack/keystone-bootstrap-dxkdq" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.455783 4563 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-895bd5bbf-mc6v8"] Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.456469 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-dns-svc\") pod \"dnsmasq-dns-7dbf8bff67-72zrm\" (UID: \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\") " pod="openstack/dnsmasq-dns-7dbf8bff67-72zrm" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.457692 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-ovsdbserver-sb\") pod \"dnsmasq-dns-7dbf8bff67-72zrm\" (UID: \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\") " pod="openstack/dnsmasq-dns-7dbf8bff67-72zrm" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.457760 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-dns-swift-storage-0\") pod \"dnsmasq-dns-7dbf8bff67-72zrm\" (UID: \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\") " pod="openstack/dnsmasq-dns-7dbf8bff67-72zrm" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.458075 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-config\") pod \"dnsmasq-dns-7dbf8bff67-72zrm\" (UID: \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\") " pod="openstack/dnsmasq-dns-7dbf8bff67-72zrm" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.461978 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-ovsdbserver-nb\") pod \"dnsmasq-dns-7dbf8bff67-72zrm\" (UID: \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\") " pod="openstack/dnsmasq-dns-7dbf8bff67-72zrm" Nov 24 09:18:24 crc 
kubenswrapper[4563]: I1124 09:18:24.506934 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.509468 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.518034 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.518186 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.518678 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8cl4\" (UniqueName: \"kubernetes.io/projected/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-kube-api-access-g8cl4\") pod \"dnsmasq-dns-7dbf8bff67-72zrm\" (UID: \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\") " pod="openstack/dnsmasq-dns-7dbf8bff67-72zrm" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.535901 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.548193 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7dbf8bff67-72zrm" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.555866 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-combined-ca-bundle\") pod \"keystone-bootstrap-dxkdq\" (UID: \"6e3d5e6c-5414-4094-8673-4d2626067ce8\") " pod="openstack/keystone-bootstrap-dxkdq" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.555920 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-scripts\") pod \"keystone-bootstrap-dxkdq\" (UID: \"6e3d5e6c-5414-4094-8673-4d2626067ce8\") " pod="openstack/keystone-bootstrap-dxkdq" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.555951 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-fernet-keys\") pod \"keystone-bootstrap-dxkdq\" (UID: \"6e3d5e6c-5414-4094-8673-4d2626067ce8\") " pod="openstack/keystone-bootstrap-dxkdq" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.555983 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq4pl\" (UniqueName: \"kubernetes.io/projected/38e4b9dc-4234-4602-9111-514d6d94e10b-kube-api-access-bq4pl\") pod \"horizon-895bd5bbf-mc6v8\" (UID: \"38e4b9dc-4234-4602-9111-514d6d94e10b\") " pod="openstack/horizon-895bd5bbf-mc6v8" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.556022 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-credential-keys\") pod \"keystone-bootstrap-dxkdq\" (UID: \"6e3d5e6c-5414-4094-8673-4d2626067ce8\") " pod="openstack/keystone-bootstrap-dxkdq" Nov 24 09:18:24 
crc kubenswrapper[4563]: I1124 09:18:24.556059 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38e4b9dc-4234-4602-9111-514d6d94e10b-config-data\") pod \"horizon-895bd5bbf-mc6v8\" (UID: \"38e4b9dc-4234-4602-9111-514d6d94e10b\") " pod="openstack/horizon-895bd5bbf-mc6v8" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.556075 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/38e4b9dc-4234-4602-9111-514d6d94e10b-horizon-secret-key\") pod \"horizon-895bd5bbf-mc6v8\" (UID: \"38e4b9dc-4234-4602-9111-514d6d94e10b\") " pod="openstack/horizon-895bd5bbf-mc6v8" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.556094 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38e4b9dc-4234-4602-9111-514d6d94e10b-logs\") pod \"horizon-895bd5bbf-mc6v8\" (UID: \"38e4b9dc-4234-4602-9111-514d6d94e10b\") " pod="openstack/horizon-895bd5bbf-mc6v8" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.556118 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m8m5\" (UniqueName: \"kubernetes.io/projected/6e3d5e6c-5414-4094-8673-4d2626067ce8-kube-api-access-4m8m5\") pod \"keystone-bootstrap-dxkdq\" (UID: \"6e3d5e6c-5414-4094-8673-4d2626067ce8\") " pod="openstack/keystone-bootstrap-dxkdq" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.556136 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38e4b9dc-4234-4602-9111-514d6d94e10b-scripts\") pod \"horizon-895bd5bbf-mc6v8\" (UID: \"38e4b9dc-4234-4602-9111-514d6d94e10b\") " pod="openstack/horizon-895bd5bbf-mc6v8" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.556165 
4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-config-data\") pod \"keystone-bootstrap-dxkdq\" (UID: \"6e3d5e6c-5414-4094-8673-4d2626067ce8\") " pod="openstack/keystone-bootstrap-dxkdq" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.561478 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-config-data\") pod \"keystone-bootstrap-dxkdq\" (UID: \"6e3d5e6c-5414-4094-8673-4d2626067ce8\") " pod="openstack/keystone-bootstrap-dxkdq" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.582974 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-combined-ca-bundle\") pod \"keystone-bootstrap-dxkdq\" (UID: \"6e3d5e6c-5414-4094-8673-4d2626067ce8\") " pod="openstack/keystone-bootstrap-dxkdq" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.587923 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-fernet-keys\") pod \"keystone-bootstrap-dxkdq\" (UID: \"6e3d5e6c-5414-4094-8673-4d2626067ce8\") " pod="openstack/keystone-bootstrap-dxkdq" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.591441 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-credential-keys\") pod \"keystone-bootstrap-dxkdq\" (UID: \"6e3d5e6c-5414-4094-8673-4d2626067ce8\") " pod="openstack/keystone-bootstrap-dxkdq" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.626437 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-ngwrz"] Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 
09:18:24.627279 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-scripts\") pod \"keystone-bootstrap-dxkdq\" (UID: \"6e3d5e6c-5414-4094-8673-4d2626067ce8\") " pod="openstack/keystone-bootstrap-dxkdq" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.627969 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-ngwrz" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.630815 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m8m5\" (UniqueName: \"kubernetes.io/projected/6e3d5e6c-5414-4094-8673-4d2626067ce8-kube-api-access-4m8m5\") pod \"keystone-bootstrap-dxkdq\" (UID: \"6e3d5e6c-5414-4094-8673-4d2626067ce8\") " pod="openstack/keystone-bootstrap-dxkdq" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.655101 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-56mjh" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.655294 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.655415 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.655813 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-dxkdq" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.657357 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b995b5b3-41b0-4334-9f7c-792a50e780e7-run-httpd\") pod \"ceilometer-0\" (UID: \"b995b5b3-41b0-4334-9f7c-792a50e780e7\") " pod="openstack/ceilometer-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.657418 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b995b5b3-41b0-4334-9f7c-792a50e780e7-config-data\") pod \"ceilometer-0\" (UID: \"b995b5b3-41b0-4334-9f7c-792a50e780e7\") " pod="openstack/ceilometer-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.657475 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq4pl\" (UniqueName: \"kubernetes.io/projected/38e4b9dc-4234-4602-9111-514d6d94e10b-kube-api-access-bq4pl\") pod \"horizon-895bd5bbf-mc6v8\" (UID: \"38e4b9dc-4234-4602-9111-514d6d94e10b\") " pod="openstack/horizon-895bd5bbf-mc6v8" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.657506 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b995b5b3-41b0-4334-9f7c-792a50e780e7-scripts\") pod \"ceilometer-0\" (UID: \"b995b5b3-41b0-4334-9f7c-792a50e780e7\") " pod="openstack/ceilometer-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.657523 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b995b5b3-41b0-4334-9f7c-792a50e780e7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b995b5b3-41b0-4334-9f7c-792a50e780e7\") " pod="openstack/ceilometer-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.657556 4563 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38e4b9dc-4234-4602-9111-514d6d94e10b-config-data\") pod \"horizon-895bd5bbf-mc6v8\" (UID: \"38e4b9dc-4234-4602-9111-514d6d94e10b\") " pod="openstack/horizon-895bd5bbf-mc6v8" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.657571 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/38e4b9dc-4234-4602-9111-514d6d94e10b-horizon-secret-key\") pod \"horizon-895bd5bbf-mc6v8\" (UID: \"38e4b9dc-4234-4602-9111-514d6d94e10b\") " pod="openstack/horizon-895bd5bbf-mc6v8" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.657590 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38e4b9dc-4234-4602-9111-514d6d94e10b-logs\") pod \"horizon-895bd5bbf-mc6v8\" (UID: \"38e4b9dc-4234-4602-9111-514d6d94e10b\") " pod="openstack/horizon-895bd5bbf-mc6v8" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.657605 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b995b5b3-41b0-4334-9f7c-792a50e780e7-log-httpd\") pod \"ceilometer-0\" (UID: \"b995b5b3-41b0-4334-9f7c-792a50e780e7\") " pod="openstack/ceilometer-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.657624 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plvfd\" (UniqueName: \"kubernetes.io/projected/b995b5b3-41b0-4334-9f7c-792a50e780e7-kube-api-access-plvfd\") pod \"ceilometer-0\" (UID: \"b995b5b3-41b0-4334-9f7c-792a50e780e7\") " pod="openstack/ceilometer-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.660858 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/38e4b9dc-4234-4602-9111-514d6d94e10b-scripts\") pod \"horizon-895bd5bbf-mc6v8\" (UID: \"38e4b9dc-4234-4602-9111-514d6d94e10b\") " pod="openstack/horizon-895bd5bbf-mc6v8" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.660974 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b995b5b3-41b0-4334-9f7c-792a50e780e7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b995b5b3-41b0-4334-9f7c-792a50e780e7\") " pod="openstack/ceilometer-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.662794 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38e4b9dc-4234-4602-9111-514d6d94e10b-config-data\") pod \"horizon-895bd5bbf-mc6v8\" (UID: \"38e4b9dc-4234-4602-9111-514d6d94e10b\") " pod="openstack/horizon-895bd5bbf-mc6v8" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.663002 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38e4b9dc-4234-4602-9111-514d6d94e10b-logs\") pod \"horizon-895bd5bbf-mc6v8\" (UID: \"38e4b9dc-4234-4602-9111-514d6d94e10b\") " pod="openstack/horizon-895bd5bbf-mc6v8" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.663070 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-ngwrz"] Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.664130 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38e4b9dc-4234-4602-9111-514d6d94e10b-scripts\") pod \"horizon-895bd5bbf-mc6v8\" (UID: \"38e4b9dc-4234-4602-9111-514d6d94e10b\") " pod="openstack/horizon-895bd5bbf-mc6v8" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.684061 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/38e4b9dc-4234-4602-9111-514d6d94e10b-horizon-secret-key\") pod \"horizon-895bd5bbf-mc6v8\" (UID: \"38e4b9dc-4234-4602-9111-514d6d94e10b\") " pod="openstack/horizon-895bd5bbf-mc6v8" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.698691 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-stk25"] Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.699972 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-stk25" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.722607 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.723061 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.723252 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-llspk" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.723253 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq4pl\" (UniqueName: \"kubernetes.io/projected/38e4b9dc-4234-4602-9111-514d6d94e10b-kube-api-access-bq4pl\") pod \"horizon-895bd5bbf-mc6v8\" (UID: \"38e4b9dc-4234-4602-9111-514d6d94e10b\") " pod="openstack/horizon-895bd5bbf-mc6v8" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.723564 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5ffd6f76f7-4hmjv"] Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.724883 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.725764 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.726019 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5ffd6f76f7-4hmjv" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.754455 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.754655 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-c5b44" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.754802 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.754910 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.770514 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-stk25"] Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.770914 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-895bd5bbf-mc6v8" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.772136 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0793e21-229f-415e-8b3e-1499e1ed3bf6-scripts\") pod \"cinder-db-sync-stk25\" (UID: \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\") " pod="openstack/cinder-db-sync-stk25" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.772166 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b0793e21-229f-415e-8b3e-1499e1ed3bf6-etc-machine-id\") pod \"cinder-db-sync-stk25\" (UID: \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\") " pod="openstack/cinder-db-sync-stk25" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.772211 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b995b5b3-41b0-4334-9f7c-792a50e780e7-scripts\") pod \"ceilometer-0\" (UID: \"b995b5b3-41b0-4334-9f7c-792a50e780e7\") " pod="openstack/ceilometer-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.772232 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b995b5b3-41b0-4334-9f7c-792a50e780e7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b995b5b3-41b0-4334-9f7c-792a50e780e7\") " pod="openstack/ceilometer-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.772266 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d128ad0e-d5fb-4f46-a737-b68fb15b7dc6-config\") pod \"neutron-db-sync-ngwrz\" (UID: \"d128ad0e-d5fb-4f46-a737-b68fb15b7dc6\") " pod="openstack/neutron-db-sync-ngwrz" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.772294 4563 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnr86\" (UniqueName: \"kubernetes.io/projected/b0793e21-229f-415e-8b3e-1499e1ed3bf6-kube-api-access-wnr86\") pod \"cinder-db-sync-stk25\" (UID: \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\") " pod="openstack/cinder-db-sync-stk25" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.772318 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0793e21-229f-415e-8b3e-1499e1ed3bf6-combined-ca-bundle\") pod \"cinder-db-sync-stk25\" (UID: \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\") " pod="openstack/cinder-db-sync-stk25" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.772334 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b995b5b3-41b0-4334-9f7c-792a50e780e7-log-httpd\") pod \"ceilometer-0\" (UID: \"b995b5b3-41b0-4334-9f7c-792a50e780e7\") " pod="openstack/ceilometer-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.772351 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plvfd\" (UniqueName: \"kubernetes.io/projected/b995b5b3-41b0-4334-9f7c-792a50e780e7-kube-api-access-plvfd\") pod \"ceilometer-0\" (UID: \"b995b5b3-41b0-4334-9f7c-792a50e780e7\") " pod="openstack/ceilometer-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.772377 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0793e21-229f-415e-8b3e-1499e1ed3bf6-config-data\") pod \"cinder-db-sync-stk25\" (UID: \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\") " pod="openstack/cinder-db-sync-stk25" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.772395 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8p8ls\" (UniqueName: \"kubernetes.io/projected/d128ad0e-d5fb-4f46-a737-b68fb15b7dc6-kube-api-access-8p8ls\") pod \"neutron-db-sync-ngwrz\" (UID: \"d128ad0e-d5fb-4f46-a737-b68fb15b7dc6\") " pod="openstack/neutron-db-sync-ngwrz" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.772423 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d128ad0e-d5fb-4f46-a737-b68fb15b7dc6-combined-ca-bundle\") pod \"neutron-db-sync-ngwrz\" (UID: \"d128ad0e-d5fb-4f46-a737-b68fb15b7dc6\") " pod="openstack/neutron-db-sync-ngwrz" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.772449 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b995b5b3-41b0-4334-9f7c-792a50e780e7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b995b5b3-41b0-4334-9f7c-792a50e780e7\") " pod="openstack/ceilometer-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.772485 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b995b5b3-41b0-4334-9f7c-792a50e780e7-run-httpd\") pod \"ceilometer-0\" (UID: \"b995b5b3-41b0-4334-9f7c-792a50e780e7\") " pod="openstack/ceilometer-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.772506 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b0793e21-229f-415e-8b3e-1499e1ed3bf6-db-sync-config-data\") pod \"cinder-db-sync-stk25\" (UID: \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\") " pod="openstack/cinder-db-sync-stk25" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.772536 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b995b5b3-41b0-4334-9f7c-792a50e780e7-config-data\") 
pod \"ceilometer-0\" (UID: \"b995b5b3-41b0-4334-9f7c-792a50e780e7\") " pod="openstack/ceilometer-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.782102 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b995b5b3-41b0-4334-9f7c-792a50e780e7-log-httpd\") pod \"ceilometer-0\" (UID: \"b995b5b3-41b0-4334-9f7c-792a50e780e7\") " pod="openstack/ceilometer-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.783726 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5ffd6f76f7-4hmjv"] Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.784054 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b995b5b3-41b0-4334-9f7c-792a50e780e7-run-httpd\") pod \"ceilometer-0\" (UID: \"b995b5b3-41b0-4334-9f7c-792a50e780e7\") " pod="openstack/ceilometer-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.796974 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b995b5b3-41b0-4334-9f7c-792a50e780e7-config-data\") pod \"ceilometer-0\" (UID: \"b995b5b3-41b0-4334-9f7c-792a50e780e7\") " pod="openstack/ceilometer-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.802386 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b995b5b3-41b0-4334-9f7c-792a50e780e7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b995b5b3-41b0-4334-9f7c-792a50e780e7\") " pod="openstack/ceilometer-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.806047 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b995b5b3-41b0-4334-9f7c-792a50e780e7-scripts\") pod \"ceilometer-0\" (UID: \"b995b5b3-41b0-4334-9f7c-792a50e780e7\") " pod="openstack/ceilometer-0" Nov 24 09:18:24 crc 
kubenswrapper[4563]: I1124 09:18:24.813386 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b995b5b3-41b0-4334-9f7c-792a50e780e7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b995b5b3-41b0-4334-9f7c-792a50e780e7\") " pod="openstack/ceilometer-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.817917 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-frjwb"] Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.819080 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-frjwb" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.834151 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.834437 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-hv8j4" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.845580 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7dbf8bff67-72zrm"] Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.862361 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plvfd\" (UniqueName: \"kubernetes.io/projected/b995b5b3-41b0-4334-9f7c-792a50e780e7-kube-api-access-plvfd\") pod \"ceilometer-0\" (UID: \"b995b5b3-41b0-4334-9f7c-792a50e780e7\") " pod="openstack/ceilometer-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.875799 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b0793e21-229f-415e-8b3e-1499e1ed3bf6-db-sync-config-data\") pod \"cinder-db-sync-stk25\" (UID: \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\") " pod="openstack/cinder-db-sync-stk25" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.875861 4563 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0edbf62a-a3fc-416a-a94d-395da81b7b63-scripts\") pod \"horizon-5ffd6f76f7-4hmjv\" (UID: \"0edbf62a-a3fc-416a-a94d-395da81b7b63\") " pod="openstack/horizon-5ffd6f76f7-4hmjv" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.875895 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd913ad3-2f58-4623-9c48-bde74b395f3f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.875957 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd913ad3-2f58-4623-9c48-bde74b395f3f-scripts\") pod \"glance-default-external-api-0\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.875986 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0edbf62a-a3fc-416a-a94d-395da81b7b63-logs\") pod \"horizon-5ffd6f76f7-4hmjv\" (UID: \"0edbf62a-a3fc-416a-a94d-395da81b7b63\") " pod="openstack/horizon-5ffd6f76f7-4hmjv" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.876023 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0793e21-229f-415e-8b3e-1499e1ed3bf6-scripts\") pod \"cinder-db-sync-stk25\" (UID: \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\") " pod="openstack/cinder-db-sync-stk25" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.876045 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b0793e21-229f-415e-8b3e-1499e1ed3bf6-etc-machine-id\") pod \"cinder-db-sync-stk25\" (UID: \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\") " pod="openstack/cinder-db-sync-stk25" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.876067 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd913ad3-2f58-4623-9c48-bde74b395f3f-config-data\") pod \"glance-default-external-api-0\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.876101 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.876128 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-454vg\" (UniqueName: \"kubernetes.io/projected/bd913ad3-2f58-4623-9c48-bde74b395f3f-kube-api-access-454vg\") pod \"glance-default-external-api-0\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.876161 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxxm9\" (UniqueName: \"kubernetes.io/projected/0edbf62a-a3fc-416a-a94d-395da81b7b63-kube-api-access-zxxm9\") pod \"horizon-5ffd6f76f7-4hmjv\" (UID: \"0edbf62a-a3fc-416a-a94d-395da81b7b63\") " pod="openstack/horizon-5ffd6f76f7-4hmjv" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.876185 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/secret/d128ad0e-d5fb-4f46-a737-b68fb15b7dc6-config\") pod \"neutron-db-sync-ngwrz\" (UID: \"d128ad0e-d5fb-4f46-a737-b68fb15b7dc6\") " pod="openstack/neutron-db-sync-ngwrz" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.876218 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd913ad3-2f58-4623-9c48-bde74b395f3f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.876237 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnr86\" (UniqueName: \"kubernetes.io/projected/b0793e21-229f-415e-8b3e-1499e1ed3bf6-kube-api-access-wnr86\") pod \"cinder-db-sync-stk25\" (UID: \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\") " pod="openstack/cinder-db-sync-stk25" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.876256 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0793e21-229f-415e-8b3e-1499e1ed3bf6-combined-ca-bundle\") pod \"cinder-db-sync-stk25\" (UID: \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\") " pod="openstack/cinder-db-sync-stk25" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.876286 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd913ad3-2f58-4623-9c48-bde74b395f3f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.876306 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b0793e21-229f-415e-8b3e-1499e1ed3bf6-config-data\") pod \"cinder-db-sync-stk25\" (UID: \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\") " pod="openstack/cinder-db-sync-stk25" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.876331 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0edbf62a-a3fc-416a-a94d-395da81b7b63-config-data\") pod \"horizon-5ffd6f76f7-4hmjv\" (UID: \"0edbf62a-a3fc-416a-a94d-395da81b7b63\") " pod="openstack/horizon-5ffd6f76f7-4hmjv" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.876349 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p8ls\" (UniqueName: \"kubernetes.io/projected/d128ad0e-d5fb-4f46-a737-b68fb15b7dc6-kube-api-access-8p8ls\") pod \"neutron-db-sync-ngwrz\" (UID: \"d128ad0e-d5fb-4f46-a737-b68fb15b7dc6\") " pod="openstack/neutron-db-sync-ngwrz" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.876386 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d128ad0e-d5fb-4f46-a737-b68fb15b7dc6-combined-ca-bundle\") pod \"neutron-db-sync-ngwrz\" (UID: \"d128ad0e-d5fb-4f46-a737-b68fb15b7dc6\") " pod="openstack/neutron-db-sync-ngwrz" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.876407 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd913ad3-2f58-4623-9c48-bde74b395f3f-logs\") pod \"glance-default-external-api-0\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.876428 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/0edbf62a-a3fc-416a-a94d-395da81b7b63-horizon-secret-key\") pod \"horizon-5ffd6f76f7-4hmjv\" (UID: \"0edbf62a-a3fc-416a-a94d-395da81b7b63\") " pod="openstack/horizon-5ffd6f76f7-4hmjv" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.890108 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0793e21-229f-415e-8b3e-1499e1ed3bf6-scripts\") pod \"cinder-db-sync-stk25\" (UID: \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\") " pod="openstack/cinder-db-sync-stk25" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.890178 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b0793e21-229f-415e-8b3e-1499e1ed3bf6-etc-machine-id\") pod \"cinder-db-sync-stk25\" (UID: \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\") " pod="openstack/cinder-db-sync-stk25" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.904149 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0793e21-229f-415e-8b3e-1499e1ed3bf6-combined-ca-bundle\") pod \"cinder-db-sync-stk25\" (UID: \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\") " pod="openstack/cinder-db-sync-stk25" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.912238 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d128ad0e-d5fb-4f46-a737-b68fb15b7dc6-config\") pod \"neutron-db-sync-ngwrz\" (UID: \"d128ad0e-d5fb-4f46-a737-b68fb15b7dc6\") " pod="openstack/neutron-db-sync-ngwrz" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.917098 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d128ad0e-d5fb-4f46-a737-b68fb15b7dc6-combined-ca-bundle\") pod \"neutron-db-sync-ngwrz\" (UID: \"d128ad0e-d5fb-4f46-a737-b68fb15b7dc6\") " 
pod="openstack/neutron-db-sync-ngwrz" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.918391 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b0793e21-229f-415e-8b3e-1499e1ed3bf6-db-sync-config-data\") pod \"cinder-db-sync-stk25\" (UID: \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\") " pod="openstack/cinder-db-sync-stk25" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.926167 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.935876 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnr86\" (UniqueName: \"kubernetes.io/projected/b0793e21-229f-415e-8b3e-1499e1ed3bf6-kube-api-access-wnr86\") pod \"cinder-db-sync-stk25\" (UID: \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\") " pod="openstack/cinder-db-sync-stk25" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.941054 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0793e21-229f-415e-8b3e-1499e1ed3bf6-config-data\") pod \"cinder-db-sync-stk25\" (UID: \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\") " pod="openstack/cinder-db-sync-stk25" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.946117 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p8ls\" (UniqueName: \"kubernetes.io/projected/d128ad0e-d5fb-4f46-a737-b68fb15b7dc6-kube-api-access-8p8ls\") pod \"neutron-db-sync-ngwrz\" (UID: \"d128ad0e-d5fb-4f46-a737-b68fb15b7dc6\") " pod="openstack/neutron-db-sync-ngwrz" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.954063 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-frjwb"] Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.969386 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-stk25" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.969552 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76c58b6d97-qn6g9"] Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.971618 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.978853 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0edbf62a-a3fc-416a-a94d-395da81b7b63-scripts\") pod \"horizon-5ffd6f76f7-4hmjv\" (UID: \"0edbf62a-a3fc-416a-a94d-395da81b7b63\") " pod="openstack/horizon-5ffd6f76f7-4hmjv" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.979053 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd913ad3-2f58-4623-9c48-bde74b395f3f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.979176 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd913ad3-2f58-4623-9c48-bde74b395f3f-scripts\") pod \"glance-default-external-api-0\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.979278 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0edbf62a-a3fc-416a-a94d-395da81b7b63-logs\") pod \"horizon-5ffd6f76f7-4hmjv\" (UID: \"0edbf62a-a3fc-416a-a94d-395da81b7b63\") " pod="openstack/horizon-5ffd6f76f7-4hmjv" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.979378 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed99b33d-c985-44bf-9a4a-b9f93bf3927f-combined-ca-bundle\") pod \"barbican-db-sync-frjwb\" (UID: \"ed99b33d-c985-44bf-9a4a-b9f93bf3927f\") " pod="openstack/barbican-db-sync-frjwb" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.979500 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd913ad3-2f58-4623-9c48-bde74b395f3f-config-data\") pod \"glance-default-external-api-0\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.979599 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.979703 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-454vg\" (UniqueName: \"kubernetes.io/projected/bd913ad3-2f58-4623-9c48-bde74b395f3f-kube-api-access-454vg\") pod \"glance-default-external-api-0\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.979810 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjgcb\" (UniqueName: \"kubernetes.io/projected/ed99b33d-c985-44bf-9a4a-b9f93bf3927f-kube-api-access-tjgcb\") pod \"barbican-db-sync-frjwb\" (UID: \"ed99b33d-c985-44bf-9a4a-b9f93bf3927f\") " pod="openstack/barbican-db-sync-frjwb" Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.979908 4563 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-zxxm9\" (UniqueName: \"kubernetes.io/projected/0edbf62a-a3fc-416a-a94d-395da81b7b63-kube-api-access-zxxm9\") pod \"horizon-5ffd6f76f7-4hmjv\" (UID: \"0edbf62a-a3fc-416a-a94d-395da81b7b63\") " pod="openstack/horizon-5ffd6f76f7-4hmjv"
Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.980023 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd913ad3-2f58-4623-9c48-bde74b395f3f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " pod="openstack/glance-default-external-api-0"
Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.980119 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ed99b33d-c985-44bf-9a4a-b9f93bf3927f-db-sync-config-data\") pod \"barbican-db-sync-frjwb\" (UID: \"ed99b33d-c985-44bf-9a4a-b9f93bf3927f\") " pod="openstack/barbican-db-sync-frjwb"
Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.980225 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd913ad3-2f58-4623-9c48-bde74b395f3f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " pod="openstack/glance-default-external-api-0"
Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.980328 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0edbf62a-a3fc-416a-a94d-395da81b7b63-config-data\") pod \"horizon-5ffd6f76f7-4hmjv\" (UID: \"0edbf62a-a3fc-416a-a94d-395da81b7b63\") " pod="openstack/horizon-5ffd6f76f7-4hmjv"
Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.980452 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName:
\"kubernetes.io/empty-dir/bd913ad3-2f58-4623-9c48-bde74b395f3f-logs\") pod \"glance-default-external-api-0\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " pod="openstack/glance-default-external-api-0"
Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.980549 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0edbf62a-a3fc-416a-a94d-395da81b7b63-horizon-secret-key\") pod \"horizon-5ffd6f76f7-4hmjv\" (UID: \"0edbf62a-a3fc-416a-a94d-395da81b7b63\") " pod="openstack/horizon-5ffd6f76f7-4hmjv"
Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.980568 4563 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0"
Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.990584 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0edbf62a-a3fc-416a-a94d-395da81b7b63-scripts\") pod \"horizon-5ffd6f76f7-4hmjv\" (UID: \"0edbf62a-a3fc-416a-a94d-395da81b7b63\") " pod="openstack/horizon-5ffd6f76f7-4hmjv"
Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.991862 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0edbf62a-a3fc-416a-a94d-395da81b7b63-config-data\") pod \"horizon-5ffd6f76f7-4hmjv\" (UID: \"0edbf62a-a3fc-416a-a94d-395da81b7b63\") " pod="openstack/horizon-5ffd6f76f7-4hmjv"
Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.996293 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd913ad3-2f58-4623-9c48-bde74b395f3f-logs\") pod \"glance-default-external-api-0\" (UID:
\"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " pod="openstack/glance-default-external-api-0"
Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.999057 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd913ad3-2f58-4623-9c48-bde74b395f3f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " pod="openstack/glance-default-external-api-0"
Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.987594 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76c58b6d97-qn6g9"]
Nov 24 09:18:24 crc kubenswrapper[4563]: I1124 09:18:24.999933 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-8klw7"]
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.001068 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8klw7"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.006133 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0edbf62a-a3fc-416a-a94d-395da81b7b63-logs\") pod \"horizon-5ffd6f76f7-4hmjv\" (UID: \"0edbf62a-a3fc-416a-a94d-395da81b7b63\") " pod="openstack/horizon-5ffd6f76f7-4hmjv"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.006765 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-8g5hg"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.006838 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.007076 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.061461 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/bd913ad3-2f58-4623-9c48-bde74b395f3f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " pod="openstack/glance-default-external-api-0"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.068570 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd913ad3-2f58-4623-9c48-bde74b395f3f-scripts\") pod \"glance-default-external-api-0\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " pod="openstack/glance-default-external-api-0"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.068949 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0edbf62a-a3fc-416a-a94d-395da81b7b63-horizon-secret-key\") pod \"horizon-5ffd6f76f7-4hmjv\" (UID: \"0edbf62a-a3fc-416a-a94d-395da81b7b63\") " pod="openstack/horizon-5ffd6f76f7-4hmjv"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.069506 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd913ad3-2f58-4623-9c48-bde74b395f3f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " pod="openstack/glance-default-external-api-0"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.075440 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-454vg\" (UniqueName: \"kubernetes.io/projected/bd913ad3-2f58-4623-9c48-bde74b395f3f-kube-api-access-454vg\") pod \"glance-default-external-api-0\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " pod="openstack/glance-default-external-api-0"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.076097 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd913ad3-2f58-4623-9c48-bde74b395f3f-config-data\") pod
\"glance-default-external-api-0\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " pod="openstack/glance-default-external-api-0"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.088013 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxxm9\" (UniqueName: \"kubernetes.io/projected/0edbf62a-a3fc-416a-a94d-395da81b7b63-kube-api-access-zxxm9\") pod \"horizon-5ffd6f76f7-4hmjv\" (UID: \"0edbf62a-a3fc-416a-a94d-395da81b7b63\") " pod="openstack/horizon-5ffd6f76f7-4hmjv"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.098587 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ed99b33d-c985-44bf-9a4a-b9f93bf3927f-db-sync-config-data\") pod \"barbican-db-sync-frjwb\" (UID: \"ed99b33d-c985-44bf-9a4a-b9f93bf3927f\") " pod="openstack/barbican-db-sync-frjwb"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.098897 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed99b33d-c985-44bf-9a4a-b9f93bf3927f-combined-ca-bundle\") pod \"barbican-db-sync-frjwb\" (UID: \"ed99b33d-c985-44bf-9a4a-b9f93bf3927f\") " pod="openstack/barbican-db-sync-frjwb"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.099019 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjgcb\" (UniqueName: \"kubernetes.io/projected/ed99b33d-c985-44bf-9a4a-b9f93bf3927f-kube-api-access-tjgcb\") pod \"barbican-db-sync-frjwb\" (UID: \"ed99b33d-c985-44bf-9a4a-b9f93bf3927f\") " pod="openstack/barbican-db-sync-frjwb"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.123934 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ed99b33d-c985-44bf-9a4a-b9f93bf3927f-db-sync-config-data\") pod \"barbican-db-sync-frjwb\" (UID:
\"ed99b33d-c985-44bf-9a4a-b9f93bf3927f\") " pod="openstack/barbican-db-sync-frjwb"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.126008 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " pod="openstack/glance-default-external-api-0"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.142159 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.144691 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjgcb\" (UniqueName: \"kubernetes.io/projected/ed99b33d-c985-44bf-9a4a-b9f93bf3927f-kube-api-access-tjgcb\") pod \"barbican-db-sync-frjwb\" (UID: \"ed99b33d-c985-44bf-9a4a-b9f93bf3927f\") " pod="openstack/barbican-db-sync-frjwb"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.154411 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed99b33d-c985-44bf-9a4a-b9f93bf3927f-combined-ca-bundle\") pod \"barbican-db-sync-frjwb\" (UID: \"ed99b33d-c985-44bf-9a4a-b9f93bf3927f\") " pod="openstack/barbican-db-sync-frjwb"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.155847 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8klw7"]
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.155886 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.170037 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.170182 4563 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/glance-default-internal-api-0"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.174162 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.179724 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.201893 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d803d6ca-646a-4dd5-93ef-d096b501c28a-combined-ca-bundle\") pod \"placement-db-sync-8klw7\" (UID: \"d803d6ca-646a-4dd5-93ef-d096b501c28a\") " pod="openstack/placement-db-sync-8klw7"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.203343 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npgkh\" (UniqueName: \"kubernetes.io/projected/d803d6ca-646a-4dd5-93ef-d096b501c28a-kube-api-access-npgkh\") pod \"placement-db-sync-8klw7\" (UID: \"d803d6ca-646a-4dd5-93ef-d096b501c28a\") " pod="openstack/placement-db-sync-8klw7"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.203418 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhm69\" (UniqueName: \"kubernetes.io/projected/f2330283-b2b3-4fe5-9924-bc2f48c08497-kube-api-access-dhm69\") pod \"dnsmasq-dns-76c58b6d97-qn6g9\" (UID: \"f2330283-b2b3-4fe5-9924-bc2f48c08497\") " pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.203519 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-ovsdbserver-nb\") pod \"dnsmasq-dns-76c58b6d97-qn6g9\" (UID:
\"f2330283-b2b3-4fe5-9924-bc2f48c08497\") " pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.203555 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d803d6ca-646a-4dd5-93ef-d096b501c28a-logs\") pod \"placement-db-sync-8klw7\" (UID: \"d803d6ca-646a-4dd5-93ef-d096b501c28a\") " pod="openstack/placement-db-sync-8klw7"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.203585 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-dns-svc\") pod \"dnsmasq-dns-76c58b6d97-qn6g9\" (UID: \"f2330283-b2b3-4fe5-9924-bc2f48c08497\") " pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.203625 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-config\") pod \"dnsmasq-dns-76c58b6d97-qn6g9\" (UID: \"f2330283-b2b3-4fe5-9924-bc2f48c08497\") " pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.203663 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-dns-swift-storage-0\") pod \"dnsmasq-dns-76c58b6d97-qn6g9\" (UID: \"f2330283-b2b3-4fe5-9924-bc2f48c08497\") " pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.203817 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-ovsdbserver-sb\") pod \"dnsmasq-dns-76c58b6d97-qn6g9\" (UID:
\"f2330283-b2b3-4fe5-9924-bc2f48c08497\") " pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.203905 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d803d6ca-646a-4dd5-93ef-d096b501c28a-scripts\") pod \"placement-db-sync-8klw7\" (UID: \"d803d6ca-646a-4dd5-93ef-d096b501c28a\") " pod="openstack/placement-db-sync-8klw7"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.204021 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d803d6ca-646a-4dd5-93ef-d096b501c28a-config-data\") pod \"placement-db-sync-8klw7\" (UID: \"d803d6ca-646a-4dd5-93ef-d096b501c28a\") " pod="openstack/placement-db-sync-8klw7"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.219732 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-ngwrz"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.290558 4563 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/glance-default-external-api-0"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.306168 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " pod="openstack/glance-default-internal-api-0"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.306236 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d803d6ca-646a-4dd5-93ef-d096b501c28a-scripts\") pod \"placement-db-sync-8klw7\" (UID: \"d803d6ca-646a-4dd5-93ef-d096b501c28a\") " pod="openstack/placement-db-sync-8klw7"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.306344 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d803d6ca-646a-4dd5-93ef-d096b501c28a-config-data\") pod \"placement-db-sync-8klw7\" (UID: \"d803d6ca-646a-4dd5-93ef-d096b501c28a\") " pod="openstack/placement-db-sync-8klw7"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.310478 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d803d6ca-646a-4dd5-93ef-d096b501c28a-combined-ca-bundle\") pod \"placement-db-sync-8klw7\" (UID: \"d803d6ca-646a-4dd5-93ef-d096b501c28a\") " pod="openstack/placement-db-sync-8klw7"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.310611 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " pod="openstack/glance-default-internal-api-0"
Nov 24 09:18:25 crc
kubenswrapper[4563]: I1124 09:18:25.310678 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " pod="openstack/glance-default-internal-api-0"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.310726 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " pod="openstack/glance-default-internal-api-0"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.310902 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npgkh\" (UniqueName: \"kubernetes.io/projected/d803d6ca-646a-4dd5-93ef-d096b501c28a-kube-api-access-npgkh\") pod \"placement-db-sync-8klw7\" (UID: \"d803d6ca-646a-4dd5-93ef-d096b501c28a\") " pod="openstack/placement-db-sync-8klw7"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.310965 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhm69\" (UniqueName: \"kubernetes.io/projected/f2330283-b2b3-4fe5-9924-bc2f48c08497-kube-api-access-dhm69\") pod \"dnsmasq-dns-76c58b6d97-qn6g9\" (UID: \"f2330283-b2b3-4fe5-9924-bc2f48c08497\") " pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.311052 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-ovsdbserver-nb\") pod \"dnsmasq-dns-76c58b6d97-qn6g9\" (UID: \"f2330283-b2b3-4fe5-9924-bc2f48c08497\") " pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9"
Nov 24 09:18:25 crc kubenswrapper[4563]:
I1124 09:18:25.311075 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmrkt\" (UniqueName: \"kubernetes.io/projected/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-kube-api-access-pmrkt\") pod \"glance-default-internal-api-0\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " pod="openstack/glance-default-internal-api-0"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.311102 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d803d6ca-646a-4dd5-93ef-d096b501c28a-logs\") pod \"placement-db-sync-8klw7\" (UID: \"d803d6ca-646a-4dd5-93ef-d096b501c28a\") " pod="openstack/placement-db-sync-8klw7"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.311141 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-dns-svc\") pod \"dnsmasq-dns-76c58b6d97-qn6g9\" (UID: \"f2330283-b2b3-4fe5-9924-bc2f48c08497\") " pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.311174 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " pod="openstack/glance-default-internal-api-0"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.311194 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-config\") pod \"dnsmasq-dns-76c58b6d97-qn6g9\" (UID: \"f2330283-b2b3-4fe5-9924-bc2f48c08497\") " pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.311216 4563 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-dns-swift-storage-0\") pod \"dnsmasq-dns-76c58b6d97-qn6g9\" (UID: \"f2330283-b2b3-4fe5-9924-bc2f48c08497\") " pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.311232 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " pod="openstack/glance-default-internal-api-0"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.311303 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-logs\") pod \"glance-default-internal-api-0\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " pod="openstack/glance-default-internal-api-0"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.311348 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-ovsdbserver-sb\") pod \"dnsmasq-dns-76c58b6d97-qn6g9\" (UID: \"f2330283-b2b3-4fe5-9924-bc2f48c08497\") " pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.312233 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-ovsdbserver-sb\") pod \"dnsmasq-dns-76c58b6d97-qn6g9\" (UID: \"f2330283-b2b3-4fe5-9924-bc2f48c08497\") " pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.313798 4563 operation_generator.go:637] "MountVolume.SetUp
succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-ovsdbserver-nb\") pod \"dnsmasq-dns-76c58b6d97-qn6g9\" (UID: \"f2330283-b2b3-4fe5-9924-bc2f48c08497\") " pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.313998 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d803d6ca-646a-4dd5-93ef-d096b501c28a-combined-ca-bundle\") pod \"placement-db-sync-8klw7\" (UID: \"d803d6ca-646a-4dd5-93ef-d096b501c28a\") " pod="openstack/placement-db-sync-8klw7"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.314739 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-dns-swift-storage-0\") pod \"dnsmasq-dns-76c58b6d97-qn6g9\" (UID: \"f2330283-b2b3-4fe5-9924-bc2f48c08497\") " pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.315383 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-dns-svc\") pod \"dnsmasq-dns-76c58b6d97-qn6g9\" (UID: \"f2330283-b2b3-4fe5-9924-bc2f48c08497\") " pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.315669 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-config\") pod \"dnsmasq-dns-76c58b6d97-qn6g9\" (UID: \"f2330283-b2b3-4fe5-9924-bc2f48c08497\") " pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.315729 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d803d6ca-646a-4dd5-93ef-d096b501c28a-logs\") pod
\"placement-db-sync-8klw7\" (UID: \"d803d6ca-646a-4dd5-93ef-d096b501c28a\") " pod="openstack/placement-db-sync-8klw7"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.315864 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d803d6ca-646a-4dd5-93ef-d096b501c28a-scripts\") pod \"placement-db-sync-8klw7\" (UID: \"d803d6ca-646a-4dd5-93ef-d096b501c28a\") " pod="openstack/placement-db-sync-8klw7"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.316052 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d803d6ca-646a-4dd5-93ef-d096b501c28a-config-data\") pod \"placement-db-sync-8klw7\" (UID: \"d803d6ca-646a-4dd5-93ef-d096b501c28a\") " pod="openstack/placement-db-sync-8klw7"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.327597 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npgkh\" (UniqueName: \"kubernetes.io/projected/d803d6ca-646a-4dd5-93ef-d096b501c28a-kube-api-access-npgkh\") pod \"placement-db-sync-8klw7\" (UID: \"d803d6ca-646a-4dd5-93ef-d096b501c28a\") " pod="openstack/placement-db-sync-8klw7"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.327707 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhm69\" (UniqueName: \"kubernetes.io/projected/f2330283-b2b3-4fe5-9924-bc2f48c08497-kube-api-access-dhm69\") pod \"dnsmasq-dns-76c58b6d97-qn6g9\" (UID: \"f2330283-b2b3-4fe5-9924-bc2f48c08497\") " pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.381166 4563 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/horizon-5ffd6f76f7-4hmjv"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.413332 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " pod="openstack/glance-default-internal-api-0"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.413373 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " pod="openstack/glance-default-internal-api-0"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.413412 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-logs\") pod \"glance-default-internal-api-0\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " pod="openstack/glance-default-internal-api-0"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.413443 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " pod="openstack/glance-default-internal-api-0"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.413493 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " pod="openstack/glance-default-internal-api-0"
Nov 24 09:18:25 crc
kubenswrapper[4563]: I1124 09:18:25.413510 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " pod="openstack/glance-default-internal-api-0"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.413529 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " pod="openstack/glance-default-internal-api-0"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.413598 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmrkt\" (UniqueName: \"kubernetes.io/projected/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-kube-api-access-pmrkt\") pod \"glance-default-internal-api-0\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " pod="openstack/glance-default-internal-api-0"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.414359 4563 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.414774 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " pod="openstack/glance-default-internal-api-0"
Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.415534 4563
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-logs\") pod \"glance-default-internal-api-0\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.419218 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.421456 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.421516 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.422000 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.430864 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmrkt\" (UniqueName: 
\"kubernetes.io/projected/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-kube-api-access-pmrkt\") pod \"glance-default-internal-api-0\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.435184 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.441491 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-frjwb" Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.466809 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dxkdq"] Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.472385 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9" Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.476207 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8klw7" Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.541751 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.620130 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-895bd5bbf-mc6v8"] Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.632699 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-stk25"] Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.637934 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7dbf8bff67-72zrm"] Nov 24 09:18:25 crc kubenswrapper[4563]: W1124 09:18:25.672742 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38e4b9dc_4234_4602_9111_514d6d94e10b.slice/crio-e0b58534c12d604038f6d0afadd31cac9767018079a7a2aa3ca1db14d4a2c559 WatchSource:0}: Error finding container e0b58534c12d604038f6d0afadd31cac9767018079a7a2aa3ca1db14d4a2c559: Status 404 returned error can't find the container with id e0b58534c12d604038f6d0afadd31cac9767018079a7a2aa3ca1db14d4a2c559 Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.767707 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.845617 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-ngwrz"] Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.855190 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-frjwb"] Nov 24 09:18:25 crc kubenswrapper[4563]: W1124 09:18:25.870222 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded99b33d_c985_44bf_9a4a_b9f93bf3927f.slice/crio-7b97aaa473874328727049b0f2899fb42ed3527173e5d4c2f6250f1bfa1266b6 WatchSource:0}: Error finding container 7b97aaa473874328727049b0f2899fb42ed3527173e5d4c2f6250f1bfa1266b6: Status 404 returned error 
can't find the container with id 7b97aaa473874328727049b0f2899fb42ed3527173e5d4c2f6250f1bfa1266b6 Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.928962 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 09:18:25 crc kubenswrapper[4563]: I1124 09:18:25.935227 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5ffd6f76f7-4hmjv"] Nov 24 09:18:25 crc kubenswrapper[4563]: W1124 09:18:25.941318 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0edbf62a_a3fc_416a_a94d_395da81b7b63.slice/crio-e6bb6393ae9d4a041574d550aae78f91030f528cd1ba4a9894b25b3427b41bd9 WatchSource:0}: Error finding container e6bb6393ae9d4a041574d550aae78f91030f528cd1ba4a9894b25b3427b41bd9: Status 404 returned error can't find the container with id e6bb6393ae9d4a041574d550aae78f91030f528cd1ba4a9894b25b3427b41bd9 Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.084451 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76c58b6d97-qn6g9"] Nov 24 09:18:26 crc kubenswrapper[4563]: W1124 09:18:26.104912 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2330283_b2b3_4fe5_9924_bc2f48c08497.slice/crio-93264d69bdf6a0760e58017a2f90b805fc41f30c35f45fb27de38d97c191cc88 WatchSource:0}: Error finding container 93264d69bdf6a0760e58017a2f90b805fc41f30c35f45fb27de38d97c191cc88: Status 404 returned error can't find the container with id 93264d69bdf6a0760e58017a2f90b805fc41f30c35f45fb27de38d97c191cc88 Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.116841 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8klw7"] Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.122023 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9" 
event={"ID":"f2330283-b2b3-4fe5-9924-bc2f48c08497","Type":"ContainerStarted","Data":"93264d69bdf6a0760e58017a2f90b805fc41f30c35f45fb27de38d97c191cc88"} Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.124731 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-frjwb" event={"ID":"ed99b33d-c985-44bf-9a4a-b9f93bf3927f","Type":"ContainerStarted","Data":"7b97aaa473874328727049b0f2899fb42ed3527173e5d4c2f6250f1bfa1266b6"} Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.126186 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b995b5b3-41b0-4334-9f7c-792a50e780e7","Type":"ContainerStarted","Data":"395027ed71257e44b947a95a94e94bfd854681ce9148cbec23cb43999743c9a0"} Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.128072 4563 generic.go:334] "Generic (PLEG): container finished" podID="2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b" containerID="5fab90a41f8a498a1f06c9cdd0ed7f416107957dd420a3eee55eb2782edaef46" exitCode=0 Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.128141 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dbf8bff67-72zrm" event={"ID":"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b","Type":"ContainerDied","Data":"5fab90a41f8a498a1f06c9cdd0ed7f416107957dd420a3eee55eb2782edaef46"} Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.128166 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dbf8bff67-72zrm" event={"ID":"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b","Type":"ContainerStarted","Data":"b5c622c8f65e29323ea9f378e2facd4dedbcd62abba21cc3f3d718fce671fabd"} Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.130657 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dxkdq" event={"ID":"6e3d5e6c-5414-4094-8673-4d2626067ce8","Type":"ContainerStarted","Data":"30a4d5df861b3290bb9630c2ee99164697fb288bb158d589ff153afed7599d4d"} Nov 24 09:18:26 crc kubenswrapper[4563]: 
I1124 09:18:26.130691 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dxkdq" event={"ID":"6e3d5e6c-5414-4094-8673-4d2626067ce8","Type":"ContainerStarted","Data":"90a7ebe4d7ae0619fa9e6d41bcfddc48146b62cf9629e517aa0dd8f8e0e0d1b6"} Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.136383 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-895bd5bbf-mc6v8" event={"ID":"38e4b9dc-4234-4602-9111-514d6d94e10b","Type":"ContainerStarted","Data":"e0b58534c12d604038f6d0afadd31cac9767018079a7a2aa3ca1db14d4a2c559"} Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.140843 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ngwrz" event={"ID":"d128ad0e-d5fb-4f46-a737-b68fb15b7dc6","Type":"ContainerStarted","Data":"cd9a6e0bb9cf33eeb070fb668ea6ead3754b52490ba66ba3b99f0717f840c0b9"} Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.140871 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ngwrz" event={"ID":"d128ad0e-d5fb-4f46-a737-b68fb15b7dc6","Type":"ContainerStarted","Data":"f60033dc81098f9cf3acee6f0c67e4ef3ed6d30b9886b9edab123060ff055e53"} Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.142529 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-stk25" event={"ID":"b0793e21-229f-415e-8b3e-1499e1ed3bf6","Type":"ContainerStarted","Data":"a959c89158ac6d8363a77fa25e41b421a80c8d98331aa61e9bf5e15b6c0e6f5c"} Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.144904 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bd913ad3-2f58-4623-9c48-bde74b395f3f","Type":"ContainerStarted","Data":"79e6defd1d263ee36f5fb060bd5ca2d390179352fb7e916bb28e438134824392"} Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.151233 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5ffd6f76f7-4hmjv" 
event={"ID":"0edbf62a-a3fc-416a-a94d-395da81b7b63","Type":"ContainerStarted","Data":"e6bb6393ae9d4a041574d550aae78f91030f528cd1ba4a9894b25b3427b41bd9"} Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.151359 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" podUID="b6c54567-0ce3-4ce9-a2b9-30e16c7760bc" containerName="dnsmasq-dns" containerID="cri-o://97c9fa9d60e8d8a75bc153f26d281fff59c9fb9a21a3a15ab4df31a3c3276851" gracePeriod=10 Nov 24 09:18:26 crc kubenswrapper[4563]: W1124 09:18:26.166415 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd803d6ca_646a_4dd5_93ef_d096b501c28a.slice/crio-91de1d918813f67bcad3b84860039b1937ba4b42fd6e46c790398237d4bf71a2 WatchSource:0}: Error finding container 91de1d918813f67bcad3b84860039b1937ba4b42fd6e46c790398237d4bf71a2: Status 404 returned error can't find the container with id 91de1d918813f67bcad3b84860039b1937ba4b42fd6e46c790398237d4bf71a2 Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.170407 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-ngwrz" podStartSLOduration=2.1703917759999998 podStartE2EDuration="2.170391776s" podCreationTimestamp="2025-11-24 09:18:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:18:26.155585838 +0000 UTC m=+883.414563285" watchObservedRunningTime="2025-11-24 09:18:26.170391776 +0000 UTC m=+883.429369223" Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.175215 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-dxkdq" podStartSLOduration=2.175207148 podStartE2EDuration="2.175207148s" podCreationTimestamp="2025-11-24 09:18:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:18:26.170470494 +0000 UTC m=+883.429447941" watchObservedRunningTime="2025-11-24 09:18:26.175207148 +0000 UTC m=+883.434184595" Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.273683 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 09:18:26 crc kubenswrapper[4563]: W1124 09:18:26.293549 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef7d1434_5a60_4b4b_8ad7_ac8e32ebd5ab.slice/crio-a3eb24d9d3a77b7a2de2449ca3c14fc4753de383a772384c64af3a5af38fd5ce WatchSource:0}: Error finding container a3eb24d9d3a77b7a2de2449ca3c14fc4753de383a772384c64af3a5af38fd5ce: Status 404 returned error can't find the container with id a3eb24d9d3a77b7a2de2449ca3c14fc4753de383a772384c64af3a5af38fd5ce Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.441065 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7dbf8bff67-72zrm" Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.536100 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.546742 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-dns-swift-storage-0\") pod \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\" (UID: \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\") " Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.546869 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-dns-svc\") pod \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\" (UID: \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\") " Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.546899 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-ovsdbserver-nb\") pod \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\" (UID: \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\") " Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.546984 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-config\") pod \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\" (UID: \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\") " Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.547050 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-ovsdbserver-sb\") pod \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\" (UID: \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\") " Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.547149 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8cl4\" 
(UniqueName: \"kubernetes.io/projected/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-kube-api-access-g8cl4\") pod \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\" (UID: \"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b\") " Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.557463 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-kube-api-access-g8cl4" (OuterVolumeSpecName: "kube-api-access-g8cl4") pod "2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b" (UID: "2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b"). InnerVolumeSpecName "kube-api-access-g8cl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.582514 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-config" (OuterVolumeSpecName: "config") pod "2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b" (UID: "2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.585787 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b" (UID: "2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.592999 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b" (UID: "2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.610165 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b" (UID: "2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.613239 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b" (UID: "2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.650287 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-ovsdbserver-sb\") pod \"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\" (UID: \"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\") " Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.650417 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-ovsdbserver-nb\") pod \"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\" (UID: \"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\") " Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.650436 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-dns-swift-storage-0\") pod \"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\" (UID: 
\"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\") " Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.650468 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-dns-svc\") pod \"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\" (UID: \"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\") " Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.650511 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptj59\" (UniqueName: \"kubernetes.io/projected/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-kube-api-access-ptj59\") pod \"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\" (UID: \"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\") " Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.650573 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-config\") pod \"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\" (UID: \"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc\") " Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.651661 4563 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.651691 4563 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.651702 4563 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.651712 4563 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.651723 4563 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.651732 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8cl4\" (UniqueName: \"kubernetes.io/projected/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b-kube-api-access-g8cl4\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.668093 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-kube-api-access-ptj59" (OuterVolumeSpecName: "kube-api-access-ptj59") pod "b6c54567-0ce3-4ce9-a2b9-30e16c7760bc" (UID: "b6c54567-0ce3-4ce9-a2b9-30e16c7760bc"). InnerVolumeSpecName "kube-api-access-ptj59". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.712267 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b6c54567-0ce3-4ce9-a2b9-30e16c7760bc" (UID: "b6c54567-0ce3-4ce9-a2b9-30e16c7760bc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.714116 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b6c54567-0ce3-4ce9-a2b9-30e16c7760bc" (UID: "b6c54567-0ce3-4ce9-a2b9-30e16c7760bc"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.716937 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b6c54567-0ce3-4ce9-a2b9-30e16c7760bc" (UID: "b6c54567-0ce3-4ce9-a2b9-30e16c7760bc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.723456 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b6c54567-0ce3-4ce9-a2b9-30e16c7760bc" (UID: "b6c54567-0ce3-4ce9-a2b9-30e16c7760bc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.724564 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-config" (OuterVolumeSpecName: "config") pod "b6c54567-0ce3-4ce9-a2b9-30e16c7760bc" (UID: "b6c54567-0ce3-4ce9-a2b9-30e16c7760bc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.756738 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.756817 4563 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.756850 4563 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.756861 4563 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.756873 4563 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:26 crc kubenswrapper[4563]: I1124 09:18:26.756883 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptj59\" (UniqueName: \"kubernetes.io/projected/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc-kube-api-access-ptj59\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.183468 4563 generic.go:334] "Generic (PLEG): container finished" podID="f2330283-b2b3-4fe5-9924-bc2f48c08497" containerID="cd9a2e6efb8b95d3e0180b68415a21edfdfb8771f9b1c7b43b95815c58884285" exitCode=0 Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.185230 4563 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9" event={"ID":"f2330283-b2b3-4fe5-9924-bc2f48c08497","Type":"ContainerDied","Data":"cd9a2e6efb8b95d3e0180b68415a21edfdfb8771f9b1c7b43b95815c58884285"} Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.220759 4563 generic.go:334] "Generic (PLEG): container finished" podID="b6c54567-0ce3-4ce9-a2b9-30e16c7760bc" containerID="97c9fa9d60e8d8a75bc153f26d281fff59c9fb9a21a3a15ab4df31a3c3276851" exitCode=0 Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.220919 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" event={"ID":"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc","Type":"ContainerDied","Data":"97c9fa9d60e8d8a75bc153f26d281fff59c9fb9a21a3a15ab4df31a3c3276851"} Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.220963 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" event={"ID":"b6c54567-0ce3-4ce9-a2b9-30e16c7760bc","Type":"ContainerDied","Data":"11eadec90d1894f725c8bb09b9f3ab86a6cec0d4f49dd2d5368b5dfbe90e50fd"} Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.220984 4563 scope.go:117] "RemoveContainer" containerID="97c9fa9d60e8d8a75bc153f26d281fff59c9fb9a21a3a15ab4df31a3c3276851" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.223382 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6856c564b9-8cxgs" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.224893 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7dbf8bff67-72zrm" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.226005 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dbf8bff67-72zrm" event={"ID":"2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b","Type":"ContainerDied","Data":"b5c622c8f65e29323ea9f378e2facd4dedbcd62abba21cc3f3d718fce671fabd"} Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.234378 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab","Type":"ContainerStarted","Data":"4c4adf3fda70bd0403946d2ea1fe2270168e48af824d9046727bde6eaa33062b"} Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.234435 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab","Type":"ContainerStarted","Data":"a3eb24d9d3a77b7a2de2449ca3c14fc4753de383a772384c64af3a5af38fd5ce"} Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.273874 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6856c564b9-8cxgs"] Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.274003 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8klw7" event={"ID":"d803d6ca-646a-4dd5-93ef-d096b501c28a","Type":"ContainerStarted","Data":"91de1d918813f67bcad3b84860039b1937ba4b42fd6e46c790398237d4bf71a2"} Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.277255 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6856c564b9-8cxgs"] Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.280258 4563 scope.go:117] "RemoveContainer" containerID="f020b5325ddf6f6e930c06a9f359eae126eebf2627cc81e3f18271b0687b3156" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.285433 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"bd913ad3-2f58-4623-9c48-bde74b395f3f","Type":"ContainerStarted","Data":"29d4e65d72ef44b25bf575ada96dff280db99f3561ba615e5ed9e208bd1a5e91"} Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.327003 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.337509 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7dbf8bff67-72zrm"] Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.352595 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7dbf8bff67-72zrm"] Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.364409 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5ffd6f76f7-4hmjv"] Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.433731 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.441931 4563 scope.go:117] "RemoveContainer" containerID="97c9fa9d60e8d8a75bc153f26d281fff59c9fb9a21a3a15ab4df31a3c3276851" Nov 24 09:18:27 crc kubenswrapper[4563]: E1124 09:18:27.442591 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97c9fa9d60e8d8a75bc153f26d281fff59c9fb9a21a3a15ab4df31a3c3276851\": container with ID starting with 97c9fa9d60e8d8a75bc153f26d281fff59c9fb9a21a3a15ab4df31a3c3276851 not found: ID does not exist" containerID="97c9fa9d60e8d8a75bc153f26d281fff59c9fb9a21a3a15ab4df31a3c3276851" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.442631 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97c9fa9d60e8d8a75bc153f26d281fff59c9fb9a21a3a15ab4df31a3c3276851"} err="failed to get container status \"97c9fa9d60e8d8a75bc153f26d281fff59c9fb9a21a3a15ab4df31a3c3276851\": rpc error: code = NotFound desc = could not find container 
\"97c9fa9d60e8d8a75bc153f26d281fff59c9fb9a21a3a15ab4df31a3c3276851\": container with ID starting with 97c9fa9d60e8d8a75bc153f26d281fff59c9fb9a21a3a15ab4df31a3c3276851 not found: ID does not exist" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.442666 4563 scope.go:117] "RemoveContainer" containerID="f020b5325ddf6f6e930c06a9f359eae126eebf2627cc81e3f18271b0687b3156" Nov 24 09:18:27 crc kubenswrapper[4563]: E1124 09:18:27.444205 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f020b5325ddf6f6e930c06a9f359eae126eebf2627cc81e3f18271b0687b3156\": container with ID starting with f020b5325ddf6f6e930c06a9f359eae126eebf2627cc81e3f18271b0687b3156 not found: ID does not exist" containerID="f020b5325ddf6f6e930c06a9f359eae126eebf2627cc81e3f18271b0687b3156" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.444238 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f020b5325ddf6f6e930c06a9f359eae126eebf2627cc81e3f18271b0687b3156"} err="failed to get container status \"f020b5325ddf6f6e930c06a9f359eae126eebf2627cc81e3f18271b0687b3156\": rpc error: code = NotFound desc = could not find container \"f020b5325ddf6f6e930c06a9f359eae126eebf2627cc81e3f18271b0687b3156\": container with ID starting with f020b5325ddf6f6e930c06a9f359eae126eebf2627cc81e3f18271b0687b3156 not found: ID does not exist" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.444253 4563 scope.go:117] "RemoveContainer" containerID="5fab90a41f8a498a1f06c9cdd0ed7f416107957dd420a3eee55eb2782edaef46" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.469060 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.479680 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-86c668996f-cnx27"] Nov 24 09:18:27 crc kubenswrapper[4563]: E1124 09:18:27.480227 
4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6c54567-0ce3-4ce9-a2b9-30e16c7760bc" containerName="dnsmasq-dns" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.480240 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6c54567-0ce3-4ce9-a2b9-30e16c7760bc" containerName="dnsmasq-dns" Nov 24 09:18:27 crc kubenswrapper[4563]: E1124 09:18:27.480259 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6c54567-0ce3-4ce9-a2b9-30e16c7760bc" containerName="init" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.480264 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6c54567-0ce3-4ce9-a2b9-30e16c7760bc" containerName="init" Nov 24 09:18:27 crc kubenswrapper[4563]: E1124 09:18:27.480277 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b" containerName="init" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.480282 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b" containerName="init" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.480476 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b" containerName="init" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.480492 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6c54567-0ce3-4ce9-a2b9-30e16c7760bc" containerName="dnsmasq-dns" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.481367 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-86c668996f-cnx27" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.486881 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-86c668996f-cnx27"] Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.584941 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e6fe7506-2be4-49fb-9295-bf5960b88baf-horizon-secret-key\") pod \"horizon-86c668996f-cnx27\" (UID: \"e6fe7506-2be4-49fb-9295-bf5960b88baf\") " pod="openstack/horizon-86c668996f-cnx27" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.585024 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6fe7506-2be4-49fb-9295-bf5960b88baf-logs\") pod \"horizon-86c668996f-cnx27\" (UID: \"e6fe7506-2be4-49fb-9295-bf5960b88baf\") " pod="openstack/horizon-86c668996f-cnx27" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.585100 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6fe7506-2be4-49fb-9295-bf5960b88baf-scripts\") pod \"horizon-86c668996f-cnx27\" (UID: \"e6fe7506-2be4-49fb-9295-bf5960b88baf\") " pod="openstack/horizon-86c668996f-cnx27" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.585158 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6fe7506-2be4-49fb-9295-bf5960b88baf-config-data\") pod \"horizon-86c668996f-cnx27\" (UID: \"e6fe7506-2be4-49fb-9295-bf5960b88baf\") " pod="openstack/horizon-86c668996f-cnx27" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.585190 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68lxl\" (UniqueName: 
\"kubernetes.io/projected/e6fe7506-2be4-49fb-9295-bf5960b88baf-kube-api-access-68lxl\") pod \"horizon-86c668996f-cnx27\" (UID: \"e6fe7506-2be4-49fb-9295-bf5960b88baf\") " pod="openstack/horizon-86c668996f-cnx27" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.687560 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6fe7506-2be4-49fb-9295-bf5960b88baf-scripts\") pod \"horizon-86c668996f-cnx27\" (UID: \"e6fe7506-2be4-49fb-9295-bf5960b88baf\") " pod="openstack/horizon-86c668996f-cnx27" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.687673 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6fe7506-2be4-49fb-9295-bf5960b88baf-config-data\") pod \"horizon-86c668996f-cnx27\" (UID: \"e6fe7506-2be4-49fb-9295-bf5960b88baf\") " pod="openstack/horizon-86c668996f-cnx27" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.687735 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68lxl\" (UniqueName: \"kubernetes.io/projected/e6fe7506-2be4-49fb-9295-bf5960b88baf-kube-api-access-68lxl\") pod \"horizon-86c668996f-cnx27\" (UID: \"e6fe7506-2be4-49fb-9295-bf5960b88baf\") " pod="openstack/horizon-86c668996f-cnx27" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.688137 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e6fe7506-2be4-49fb-9295-bf5960b88baf-horizon-secret-key\") pod \"horizon-86c668996f-cnx27\" (UID: \"e6fe7506-2be4-49fb-9295-bf5960b88baf\") " pod="openstack/horizon-86c668996f-cnx27" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.688229 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6fe7506-2be4-49fb-9295-bf5960b88baf-logs\") pod 
\"horizon-86c668996f-cnx27\" (UID: \"e6fe7506-2be4-49fb-9295-bf5960b88baf\") " pod="openstack/horizon-86c668996f-cnx27" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.688806 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6fe7506-2be4-49fb-9295-bf5960b88baf-logs\") pod \"horizon-86c668996f-cnx27\" (UID: \"e6fe7506-2be4-49fb-9295-bf5960b88baf\") " pod="openstack/horizon-86c668996f-cnx27" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.689496 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6fe7506-2be4-49fb-9295-bf5960b88baf-scripts\") pod \"horizon-86c668996f-cnx27\" (UID: \"e6fe7506-2be4-49fb-9295-bf5960b88baf\") " pod="openstack/horizon-86c668996f-cnx27" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.690503 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6fe7506-2be4-49fb-9295-bf5960b88baf-config-data\") pod \"horizon-86c668996f-cnx27\" (UID: \"e6fe7506-2be4-49fb-9295-bf5960b88baf\") " pod="openstack/horizon-86c668996f-cnx27" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.700375 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e6fe7506-2be4-49fb-9295-bf5960b88baf-horizon-secret-key\") pod \"horizon-86c668996f-cnx27\" (UID: \"e6fe7506-2be4-49fb-9295-bf5960b88baf\") " pod="openstack/horizon-86c668996f-cnx27" Nov 24 09:18:27 crc kubenswrapper[4563]: I1124 09:18:27.708614 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68lxl\" (UniqueName: \"kubernetes.io/projected/e6fe7506-2be4-49fb-9295-bf5960b88baf-kube-api-access-68lxl\") pod \"horizon-86c668996f-cnx27\" (UID: \"e6fe7506-2be4-49fb-9295-bf5960b88baf\") " pod="openstack/horizon-86c668996f-cnx27" Nov 24 09:18:27 crc 
kubenswrapper[4563]: I1124 09:18:27.811200 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86c668996f-cnx27" Nov 24 09:18:28 crc kubenswrapper[4563]: I1124 09:18:28.170048 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-86c668996f-cnx27"] Nov 24 09:18:28 crc kubenswrapper[4563]: I1124 09:18:28.322599 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bd913ad3-2f58-4623-9c48-bde74b395f3f" containerName="glance-log" containerID="cri-o://29d4e65d72ef44b25bf575ada96dff280db99f3561ba615e5ed9e208bd1a5e91" gracePeriod=30 Nov 24 09:18:28 crc kubenswrapper[4563]: I1124 09:18:28.323022 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bd913ad3-2f58-4623-9c48-bde74b395f3f","Type":"ContainerStarted","Data":"08ac424f10b91ce2161db1c77ea70b9ee51c645df45c9315b0970706a94d85cf"} Nov 24 09:18:28 crc kubenswrapper[4563]: I1124 09:18:28.323093 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bd913ad3-2f58-4623-9c48-bde74b395f3f" containerName="glance-httpd" containerID="cri-o://08ac424f10b91ce2161db1c77ea70b9ee51c645df45c9315b0970706a94d85cf" gracePeriod=30 Nov 24 09:18:28 crc kubenswrapper[4563]: I1124 09:18:28.329117 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9" event={"ID":"f2330283-b2b3-4fe5-9924-bc2f48c08497","Type":"ContainerStarted","Data":"3dcb35494414940ca18de8743344ae135132b027cc72bd50e99f35ca4180e38b"} Nov 24 09:18:28 crc kubenswrapper[4563]: I1124 09:18:28.353996 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86c668996f-cnx27" event={"ID":"e6fe7506-2be4-49fb-9295-bf5960b88baf","Type":"ContainerStarted","Data":"f819c1b90505fc6be56127383b06724f8fe7403b0cfad7e693e4595fe2b40ffe"} Nov 24 09:18:29 
crc kubenswrapper[4563]: I1124 09:18:29.069459 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b" path="/var/lib/kubelet/pods/2e7103ae-0d02-41c0-8c45-ce6ff67f2e1b/volumes" Nov 24 09:18:29 crc kubenswrapper[4563]: I1124 09:18:29.069983 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6c54567-0ce3-4ce9-a2b9-30e16c7760bc" path="/var/lib/kubelet/pods/b6c54567-0ce3-4ce9-a2b9-30e16c7760bc/volumes" Nov 24 09:18:29 crc kubenswrapper[4563]: I1124 09:18:29.417567 4563 generic.go:334] "Generic (PLEG): container finished" podID="bd913ad3-2f58-4623-9c48-bde74b395f3f" containerID="08ac424f10b91ce2161db1c77ea70b9ee51c645df45c9315b0970706a94d85cf" exitCode=0 Nov 24 09:18:29 crc kubenswrapper[4563]: I1124 09:18:29.417837 4563 generic.go:334] "Generic (PLEG): container finished" podID="bd913ad3-2f58-4623-9c48-bde74b395f3f" containerID="29d4e65d72ef44b25bf575ada96dff280db99f3561ba615e5ed9e208bd1a5e91" exitCode=143 Nov 24 09:18:29 crc kubenswrapper[4563]: I1124 09:18:29.417717 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bd913ad3-2f58-4623-9c48-bde74b395f3f","Type":"ContainerDied","Data":"08ac424f10b91ce2161db1c77ea70b9ee51c645df45c9315b0970706a94d85cf"} Nov 24 09:18:29 crc kubenswrapper[4563]: I1124 09:18:29.418993 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bd913ad3-2f58-4623-9c48-bde74b395f3f","Type":"ContainerDied","Data":"29d4e65d72ef44b25bf575ada96dff280db99f3561ba615e5ed9e208bd1a5e91"} Nov 24 09:18:29 crc kubenswrapper[4563]: I1124 09:18:29.419033 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9" Nov 24 09:18:29 crc kubenswrapper[4563]: I1124 09:18:29.442293 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" 
podStartSLOduration=5.442271921 podStartE2EDuration="5.442271921s" podCreationTimestamp="2025-11-24 09:18:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:18:28.344521285 +0000 UTC m=+885.603498732" watchObservedRunningTime="2025-11-24 09:18:29.442271921 +0000 UTC m=+886.701249368" Nov 24 09:18:30 crc kubenswrapper[4563]: I1124 09:18:30.433123 4563 generic.go:334] "Generic (PLEG): container finished" podID="6e3d5e6c-5414-4094-8673-4d2626067ce8" containerID="30a4d5df861b3290bb9630c2ee99164697fb288bb158d589ff153afed7599d4d" exitCode=0 Nov 24 09:18:30 crc kubenswrapper[4563]: I1124 09:18:30.433158 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dxkdq" event={"ID":"6e3d5e6c-5414-4094-8673-4d2626067ce8","Type":"ContainerDied","Data":"30a4d5df861b3290bb9630c2ee99164697fb288bb158d589ff153afed7599d4d"} Nov 24 09:18:30 crc kubenswrapper[4563]: I1124 09:18:30.449592 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9" podStartSLOduration=6.449577784 podStartE2EDuration="6.449577784s" podCreationTimestamp="2025-11-24 09:18:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:18:29.442947846 +0000 UTC m=+886.701925294" watchObservedRunningTime="2025-11-24 09:18:30.449577784 +0000 UTC m=+887.708555231" Nov 24 09:18:31 crc kubenswrapper[4563]: I1124 09:18:31.744669 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-dxkdq" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:31.878045 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-config-data\") pod \"6e3d5e6c-5414-4094-8673-4d2626067ce8\" (UID: \"6e3d5e6c-5414-4094-8673-4d2626067ce8\") " Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:31.878415 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-fernet-keys\") pod \"6e3d5e6c-5414-4094-8673-4d2626067ce8\" (UID: \"6e3d5e6c-5414-4094-8673-4d2626067ce8\") " Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:31.878528 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m8m5\" (UniqueName: \"kubernetes.io/projected/6e3d5e6c-5414-4094-8673-4d2626067ce8-kube-api-access-4m8m5\") pod \"6e3d5e6c-5414-4094-8673-4d2626067ce8\" (UID: \"6e3d5e6c-5414-4094-8673-4d2626067ce8\") " Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:31.878545 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-credential-keys\") pod \"6e3d5e6c-5414-4094-8673-4d2626067ce8\" (UID: \"6e3d5e6c-5414-4094-8673-4d2626067ce8\") " Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:31.878689 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-combined-ca-bundle\") pod \"6e3d5e6c-5414-4094-8673-4d2626067ce8\" (UID: \"6e3d5e6c-5414-4094-8673-4d2626067ce8\") " Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:31.878713 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-scripts\") pod \"6e3d5e6c-5414-4094-8673-4d2626067ce8\" (UID: \"6e3d5e6c-5414-4094-8673-4d2626067ce8\") " Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:31.884087 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6e3d5e6c-5414-4094-8673-4d2626067ce8" (UID: "6e3d5e6c-5414-4094-8673-4d2626067ce8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:31.884786 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "6e3d5e6c-5414-4094-8673-4d2626067ce8" (UID: "6e3d5e6c-5414-4094-8673-4d2626067ce8"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:31.885272 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e3d5e6c-5414-4094-8673-4d2626067ce8-kube-api-access-4m8m5" (OuterVolumeSpecName: "kube-api-access-4m8m5") pod "6e3d5e6c-5414-4094-8673-4d2626067ce8" (UID: "6e3d5e6c-5414-4094-8673-4d2626067ce8"). InnerVolumeSpecName "kube-api-access-4m8m5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:31.888625 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-scripts" (OuterVolumeSpecName: "scripts") pod "6e3d5e6c-5414-4094-8673-4d2626067ce8" (UID: "6e3d5e6c-5414-4094-8673-4d2626067ce8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:31.913840 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-config-data" (OuterVolumeSpecName: "config-data") pod "6e3d5e6c-5414-4094-8673-4d2626067ce8" (UID: "6e3d5e6c-5414-4094-8673-4d2626067ce8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:31.923867 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e3d5e6c-5414-4094-8673-4d2626067ce8" (UID: "6e3d5e6c-5414-4094-8673-4d2626067ce8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:31.955150 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rppmx"] Nov 24 09:18:33 crc kubenswrapper[4563]: E1124 09:18:31.955778 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e3d5e6c-5414-4094-8673-4d2626067ce8" containerName="keystone-bootstrap" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:31.955795 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e3d5e6c-5414-4094-8673-4d2626067ce8" containerName="keystone-bootstrap" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:31.955997 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e3d5e6c-5414-4094-8673-4d2626067ce8" containerName="keystone-bootstrap" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:31.957572 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rppmx" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:31.970900 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rppmx"] Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:31.986194 4563 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:31.986224 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m8m5\" (UniqueName: \"kubernetes.io/projected/6e3d5e6c-5414-4094-8673-4d2626067ce8-kube-api-access-4m8m5\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:31.986246 4563 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:31.986256 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:31.986266 4563 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:31.986276 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e3d5e6c-5414-4094-8673-4d2626067ce8-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.051398 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.087797 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/571d80f9-80d0-4dae-bfa0-126c8055f9b0-utilities\") pod \"redhat-marketplace-rppmx\" (UID: \"571d80f9-80d0-4dae-bfa0-126c8055f9b0\") " pod="openshift-marketplace/redhat-marketplace-rppmx" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.087893 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gsh7\" (UniqueName: \"kubernetes.io/projected/571d80f9-80d0-4dae-bfa0-126c8055f9b0-kube-api-access-8gsh7\") pod \"redhat-marketplace-rppmx\" (UID: \"571d80f9-80d0-4dae-bfa0-126c8055f9b0\") " pod="openshift-marketplace/redhat-marketplace-rppmx" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.088192 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/571d80f9-80d0-4dae-bfa0-126c8055f9b0-catalog-content\") pod \"redhat-marketplace-rppmx\" (UID: \"571d80f9-80d0-4dae-bfa0-126c8055f9b0\") " pod="openshift-marketplace/redhat-marketplace-rppmx" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.190139 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd913ad3-2f58-4623-9c48-bde74b395f3f-config-data\") pod \"bd913ad3-2f58-4623-9c48-bde74b395f3f\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.190448 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd913ad3-2f58-4623-9c48-bde74b395f3f-httpd-run\") pod \"bd913ad3-2f58-4623-9c48-bde74b395f3f\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") 
" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.190474 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-454vg\" (UniqueName: \"kubernetes.io/projected/bd913ad3-2f58-4623-9c48-bde74b395f3f-kube-api-access-454vg\") pod \"bd913ad3-2f58-4623-9c48-bde74b395f3f\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.190528 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd913ad3-2f58-4623-9c48-bde74b395f3f-public-tls-certs\") pod \"bd913ad3-2f58-4623-9c48-bde74b395f3f\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.191044 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"bd913ad3-2f58-4623-9c48-bde74b395f3f\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.191090 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd913ad3-2f58-4623-9c48-bde74b395f3f-combined-ca-bundle\") pod \"bd913ad3-2f58-4623-9c48-bde74b395f3f\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.191127 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd913ad3-2f58-4623-9c48-bde74b395f3f-scripts\") pod \"bd913ad3-2f58-4623-9c48-bde74b395f3f\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.191158 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd913ad3-2f58-4623-9c48-bde74b395f3f-logs\") pod 
\"bd913ad3-2f58-4623-9c48-bde74b395f3f\" (UID: \"bd913ad3-2f58-4623-9c48-bde74b395f3f\") " Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.191413 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/571d80f9-80d0-4dae-bfa0-126c8055f9b0-catalog-content\") pod \"redhat-marketplace-rppmx\" (UID: \"571d80f9-80d0-4dae-bfa0-126c8055f9b0\") " pod="openshift-marketplace/redhat-marketplace-rppmx" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.191444 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd913ad3-2f58-4623-9c48-bde74b395f3f-logs" (OuterVolumeSpecName: "logs") pod "bd913ad3-2f58-4623-9c48-bde74b395f3f" (UID: "bd913ad3-2f58-4623-9c48-bde74b395f3f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.191493 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/571d80f9-80d0-4dae-bfa0-126c8055f9b0-utilities\") pod \"redhat-marketplace-rppmx\" (UID: \"571d80f9-80d0-4dae-bfa0-126c8055f9b0\") " pod="openshift-marketplace/redhat-marketplace-rppmx" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.191536 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gsh7\" (UniqueName: \"kubernetes.io/projected/571d80f9-80d0-4dae-bfa0-126c8055f9b0-kube-api-access-8gsh7\") pod \"redhat-marketplace-rppmx\" (UID: \"571d80f9-80d0-4dae-bfa0-126c8055f9b0\") " pod="openshift-marketplace/redhat-marketplace-rppmx" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.191617 4563 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd913ad3-2f58-4623-9c48-bde74b395f3f-logs\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.191789 
4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/571d80f9-80d0-4dae-bfa0-126c8055f9b0-catalog-content\") pod \"redhat-marketplace-rppmx\" (UID: \"571d80f9-80d0-4dae-bfa0-126c8055f9b0\") " pod="openshift-marketplace/redhat-marketplace-rppmx" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.192104 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/571d80f9-80d0-4dae-bfa0-126c8055f9b0-utilities\") pod \"redhat-marketplace-rppmx\" (UID: \"571d80f9-80d0-4dae-bfa0-126c8055f9b0\") " pod="openshift-marketplace/redhat-marketplace-rppmx" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.195570 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd913ad3-2f58-4623-9c48-bde74b395f3f-kube-api-access-454vg" (OuterVolumeSpecName: "kube-api-access-454vg") pod "bd913ad3-2f58-4623-9c48-bde74b395f3f" (UID: "bd913ad3-2f58-4623-9c48-bde74b395f3f"). InnerVolumeSpecName "kube-api-access-454vg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.195633 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "bd913ad3-2f58-4623-9c48-bde74b395f3f" (UID: "bd913ad3-2f58-4623-9c48-bde74b395f3f"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.195794 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd913ad3-2f58-4623-9c48-bde74b395f3f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bd913ad3-2f58-4623-9c48-bde74b395f3f" (UID: "bd913ad3-2f58-4623-9c48-bde74b395f3f"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.198565 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd913ad3-2f58-4623-9c48-bde74b395f3f-scripts" (OuterVolumeSpecName: "scripts") pod "bd913ad3-2f58-4623-9c48-bde74b395f3f" (UID: "bd913ad3-2f58-4623-9c48-bde74b395f3f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.206214 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gsh7\" (UniqueName: \"kubernetes.io/projected/571d80f9-80d0-4dae-bfa0-126c8055f9b0-kube-api-access-8gsh7\") pod \"redhat-marketplace-rppmx\" (UID: \"571d80f9-80d0-4dae-bfa0-126c8055f9b0\") " pod="openshift-marketplace/redhat-marketplace-rppmx" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.219428 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd913ad3-2f58-4623-9c48-bde74b395f3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd913ad3-2f58-4623-9c48-bde74b395f3f" (UID: "bd913ad3-2f58-4623-9c48-bde74b395f3f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.238839 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd913ad3-2f58-4623-9c48-bde74b395f3f-config-data" (OuterVolumeSpecName: "config-data") pod "bd913ad3-2f58-4623-9c48-bde74b395f3f" (UID: "bd913ad3-2f58-4623-9c48-bde74b395f3f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.247218 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd913ad3-2f58-4623-9c48-bde74b395f3f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bd913ad3-2f58-4623-9c48-bde74b395f3f" (UID: "bd913ad3-2f58-4623-9c48-bde74b395f3f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.293450 4563 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd913ad3-2f58-4623-9c48-bde74b395f3f-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.293478 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd913ad3-2f58-4623-9c48-bde74b395f3f-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.293488 4563 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd913ad3-2f58-4623-9c48-bde74b395f3f-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.293497 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-454vg\" (UniqueName: \"kubernetes.io/projected/bd913ad3-2f58-4623-9c48-bde74b395f3f-kube-api-access-454vg\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.293507 4563 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd913ad3-2f58-4623-9c48-bde74b395f3f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.293541 4563 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.293550 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd913ad3-2f58-4623-9c48-bde74b395f3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.302077 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rppmx" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.311273 4563 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.395902 4563 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.448660 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dxkdq" event={"ID":"6e3d5e6c-5414-4094-8673-4d2626067ce8","Type":"ContainerDied","Data":"90a7ebe4d7ae0619fa9e6d41bcfddc48146b62cf9629e517aa0dd8f8e0e0d1b6"} Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.448931 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90a7ebe4d7ae0619fa9e6d41bcfddc48146b62cf9629e517aa0dd8f8e0e0d1b6" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.449004 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-dxkdq" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.451372 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab","Type":"ContainerStarted","Data":"32e096ac55448cf12bc1f7b41cd77e0a5c191a51d755f53056683685c2fc8ecf"} Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.451477 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab" containerName="glance-log" containerID="cri-o://4c4adf3fda70bd0403946d2ea1fe2270168e48af824d9046727bde6eaa33062b" gracePeriod=30 Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.451630 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab" containerName="glance-httpd" containerID="cri-o://32e096ac55448cf12bc1f7b41cd77e0a5c191a51d755f53056683685c2fc8ecf" gracePeriod=30 Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.456050 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bd913ad3-2f58-4623-9c48-bde74b395f3f","Type":"ContainerDied","Data":"79e6defd1d263ee36f5fb060bd5ca2d390179352fb7e916bb28e438134824392"} Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.456080 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.456103 4563 scope.go:117] "RemoveContainer" containerID="08ac424f10b91ce2161db1c77ea70b9ee51c645df45c9315b0970706a94d85cf" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.485763 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.485725384 podStartE2EDuration="8.485725384s" podCreationTimestamp="2025-11-24 09:18:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:18:32.471834181 +0000 UTC m=+889.730811629" watchObservedRunningTime="2025-11-24 09:18:32.485725384 +0000 UTC m=+889.744702831" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.492977 4563 scope.go:117] "RemoveContainer" containerID="29d4e65d72ef44b25bf575ada96dff280db99f3561ba615e5ed9e208bd1a5e91" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.498917 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.503911 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.521112 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 09:18:33 crc kubenswrapper[4563]: E1124 09:18:32.521522 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd913ad3-2f58-4623-9c48-bde74b395f3f" containerName="glance-httpd" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.521536 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd913ad3-2f58-4623-9c48-bde74b395f3f" containerName="glance-httpd" Nov 24 09:18:33 crc kubenswrapper[4563]: E1124 09:18:32.521558 4563 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="bd913ad3-2f58-4623-9c48-bde74b395f3f" containerName="glance-log" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.521567 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd913ad3-2f58-4623-9c48-bde74b395f3f" containerName="glance-log" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.521761 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd913ad3-2f58-4623-9c48-bde74b395f3f" containerName="glance-httpd" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.521781 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd913ad3-2f58-4623-9c48-bde74b395f3f" containerName="glance-log" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.522649 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.525721 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.526180 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.529560 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.561464 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-dxkdq"] Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.564616 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-dxkdq"] Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.651713 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-jh96b"] Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.652632 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jh96b" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.654342 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.654397 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.655806 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.656098 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sw8dk" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.658715 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.663976 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jh96b"] Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.700707 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.700763 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cead1d3-35c0-4274-be9c-d75115aedd8a-scripts\") pod \"glance-default-external-api-0\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.700842 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/6cead1d3-35c0-4274-be9c-d75115aedd8a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.700882 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cead1d3-35c0-4274-be9c-d75115aedd8a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.700905 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cead1d3-35c0-4274-be9c-d75115aedd8a-logs\") pod \"glance-default-external-api-0\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.700926 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cead1d3-35c0-4274-be9c-d75115aedd8a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.700976 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5tcc\" (UniqueName: \"kubernetes.io/projected/6cead1d3-35c0-4274-be9c-d75115aedd8a-kube-api-access-d5tcc\") pod \"glance-default-external-api-0\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.701016 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cead1d3-35c0-4274-be9c-d75115aedd8a-config-data\") pod \"glance-default-external-api-0\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.802614 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-scripts\") pod \"keystone-bootstrap-jh96b\" (UID: \"539b9102-6a58-4804-8b35-4b183ef45c82\") " pod="openstack/keystone-bootstrap-jh96b" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.802697 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.802727 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cead1d3-35c0-4274-be9c-d75115aedd8a-scripts\") pod \"glance-default-external-api-0\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.802769 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-fernet-keys\") pod \"keystone-bootstrap-jh96b\" (UID: \"539b9102-6a58-4804-8b35-4b183ef45c82\") " pod="openstack/keystone-bootstrap-jh96b" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.802795 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/6cead1d3-35c0-4274-be9c-d75115aedd8a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.802817 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-combined-ca-bundle\") pod \"keystone-bootstrap-jh96b\" (UID: \"539b9102-6a58-4804-8b35-4b183ef45c82\") " pod="openstack/keystone-bootstrap-jh96b" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.802840 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-credential-keys\") pod \"keystone-bootstrap-jh96b\" (UID: \"539b9102-6a58-4804-8b35-4b183ef45c82\") " pod="openstack/keystone-bootstrap-jh96b" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.802858 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cead1d3-35c0-4274-be9c-d75115aedd8a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.802875 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cead1d3-35c0-4274-be9c-d75115aedd8a-logs\") pod \"glance-default-external-api-0\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.802891 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6cead1d3-35c0-4274-be9c-d75115aedd8a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.802906 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-config-data\") pod \"keystone-bootstrap-jh96b\" (UID: \"539b9102-6a58-4804-8b35-4b183ef45c82\") " pod="openstack/keystone-bootstrap-jh96b" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.802946 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5tcc\" (UniqueName: \"kubernetes.io/projected/6cead1d3-35c0-4274-be9c-d75115aedd8a-kube-api-access-d5tcc\") pod \"glance-default-external-api-0\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.802974 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cead1d3-35c0-4274-be9c-d75115aedd8a-config-data\") pod \"glance-default-external-api-0\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.803015 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlv86\" (UniqueName: \"kubernetes.io/projected/539b9102-6a58-4804-8b35-4b183ef45c82-kube-api-access-nlv86\") pod \"keystone-bootstrap-jh96b\" (UID: \"539b9102-6a58-4804-8b35-4b183ef45c82\") " pod="openstack/keystone-bootstrap-jh96b" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.803369 4563 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.805769 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6cead1d3-35c0-4274-be9c-d75115aedd8a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.807612 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cead1d3-35c0-4274-be9c-d75115aedd8a-logs\") pod \"glance-default-external-api-0\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.809779 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cead1d3-35c0-4274-be9c-d75115aedd8a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.809781 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cead1d3-35c0-4274-be9c-d75115aedd8a-scripts\") pod \"glance-default-external-api-0\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.814266 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cead1d3-35c0-4274-be9c-d75115aedd8a-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.815151 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cead1d3-35c0-4274-be9c-d75115aedd8a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.821237 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5tcc\" (UniqueName: \"kubernetes.io/projected/6cead1d3-35c0-4274-be9c-d75115aedd8a-kube-api-access-d5tcc\") pod \"glance-default-external-api-0\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.823361 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.838312 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.905102 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlv86\" (UniqueName: \"kubernetes.io/projected/539b9102-6a58-4804-8b35-4b183ef45c82-kube-api-access-nlv86\") pod \"keystone-bootstrap-jh96b\" (UID: \"539b9102-6a58-4804-8b35-4b183ef45c82\") " pod="openstack/keystone-bootstrap-jh96b" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.905155 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-scripts\") pod \"keystone-bootstrap-jh96b\" (UID: \"539b9102-6a58-4804-8b35-4b183ef45c82\") " pod="openstack/keystone-bootstrap-jh96b" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.905225 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-fernet-keys\") pod \"keystone-bootstrap-jh96b\" (UID: \"539b9102-6a58-4804-8b35-4b183ef45c82\") " pod="openstack/keystone-bootstrap-jh96b" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.905253 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-combined-ca-bundle\") pod \"keystone-bootstrap-jh96b\" (UID: \"539b9102-6a58-4804-8b35-4b183ef45c82\") " pod="openstack/keystone-bootstrap-jh96b" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.905280 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-credential-keys\") pod \"keystone-bootstrap-jh96b\" (UID: \"539b9102-6a58-4804-8b35-4b183ef45c82\") " pod="openstack/keystone-bootstrap-jh96b" Nov 24 09:18:33 crc 
kubenswrapper[4563]: I1124 09:18:32.905306 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-config-data\") pod \"keystone-bootstrap-jh96b\" (UID: \"539b9102-6a58-4804-8b35-4b183ef45c82\") " pod="openstack/keystone-bootstrap-jh96b" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.909606 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-scripts\") pod \"keystone-bootstrap-jh96b\" (UID: \"539b9102-6a58-4804-8b35-4b183ef45c82\") " pod="openstack/keystone-bootstrap-jh96b" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.909844 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-credential-keys\") pod \"keystone-bootstrap-jh96b\" (UID: \"539b9102-6a58-4804-8b35-4b183ef45c82\") " pod="openstack/keystone-bootstrap-jh96b" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.909940 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-combined-ca-bundle\") pod \"keystone-bootstrap-jh96b\" (UID: \"539b9102-6a58-4804-8b35-4b183ef45c82\") " pod="openstack/keystone-bootstrap-jh96b" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.910610 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-fernet-keys\") pod \"keystone-bootstrap-jh96b\" (UID: \"539b9102-6a58-4804-8b35-4b183ef45c82\") " pod="openstack/keystone-bootstrap-jh96b" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.912922 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-config-data\") pod \"keystone-bootstrap-jh96b\" (UID: \"539b9102-6a58-4804-8b35-4b183ef45c82\") " pod="openstack/keystone-bootstrap-jh96b" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:32.923692 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlv86\" (UniqueName: \"kubernetes.io/projected/539b9102-6a58-4804-8b35-4b183ef45c82-kube-api-access-nlv86\") pod \"keystone-bootstrap-jh96b\" (UID: \"539b9102-6a58-4804-8b35-4b183ef45c82\") " pod="openstack/keystone-bootstrap-jh96b" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:33.068191 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jh96b" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:33.099390 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e3d5e6c-5414-4094-8673-4d2626067ce8" path="/var/lib/kubelet/pods/6e3d5e6c-5414-4094-8673-4d2626067ce8/volumes" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:33.100340 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd913ad3-2f58-4623-9c48-bde74b395f3f" path="/var/lib/kubelet/pods/bd913ad3-2f58-4623-9c48-bde74b395f3f/volumes" Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:33.468909 4563 generic.go:334] "Generic (PLEG): container finished" podID="ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab" containerID="32e096ac55448cf12bc1f7b41cd77e0a5c191a51d755f53056683685c2fc8ecf" exitCode=0 Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:33.468934 4563 generic.go:334] "Generic (PLEG): container finished" podID="ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab" containerID="4c4adf3fda70bd0403946d2ea1fe2270168e48af824d9046727bde6eaa33062b" exitCode=143 Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:33.468972 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab","Type":"ContainerDied","Data":"32e096ac55448cf12bc1f7b41cd77e0a5c191a51d755f53056683685c2fc8ecf"} Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:33.469000 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab","Type":"ContainerDied","Data":"4c4adf3fda70bd0403946d2ea1fe2270168e48af824d9046727bde6eaa33062b"} Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:33.834278 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rppmx"] Nov 24 09:18:33 crc kubenswrapper[4563]: I1124 09:18:33.913493 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.296755 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-895bd5bbf-mc6v8"] Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.321901 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7cd5c59c66-hrmf5"] Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.323681 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.328839 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.330259 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.336992 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7cd5c59c66-hrmf5"] Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.432560 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0ec5b651-57ef-414b-8c8e-4b488d71663f-horizon-secret-key\") pod \"horizon-7cd5c59c66-hrmf5\" (UID: \"0ec5b651-57ef-414b-8c8e-4b488d71663f\") " pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.432601 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ec5b651-57ef-414b-8c8e-4b488d71663f-horizon-tls-certs\") pod \"horizon-7cd5c59c66-hrmf5\" (UID: \"0ec5b651-57ef-414b-8c8e-4b488d71663f\") " pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.432652 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ec5b651-57ef-414b-8c8e-4b488d71663f-logs\") pod \"horizon-7cd5c59c66-hrmf5\" (UID: \"0ec5b651-57ef-414b-8c8e-4b488d71663f\") " pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.432720 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2sch\" (UniqueName: 
\"kubernetes.io/projected/0ec5b651-57ef-414b-8c8e-4b488d71663f-kube-api-access-d2sch\") pod \"horizon-7cd5c59c66-hrmf5\" (UID: \"0ec5b651-57ef-414b-8c8e-4b488d71663f\") " pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.432756 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ec5b651-57ef-414b-8c8e-4b488d71663f-scripts\") pod \"horizon-7cd5c59c66-hrmf5\" (UID: \"0ec5b651-57ef-414b-8c8e-4b488d71663f\") " pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.432784 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ec5b651-57ef-414b-8c8e-4b488d71663f-combined-ca-bundle\") pod \"horizon-7cd5c59c66-hrmf5\" (UID: \"0ec5b651-57ef-414b-8c8e-4b488d71663f\") " pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.432815 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ec5b651-57ef-414b-8c8e-4b488d71663f-config-data\") pod \"horizon-7cd5c59c66-hrmf5\" (UID: \"0ec5b651-57ef-414b-8c8e-4b488d71663f\") " pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.471753 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-86c668996f-cnx27"] Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.496732 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6d999bbd6-cqj6s"] Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.499286 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6d999bbd6-cqj6s" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.513156 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d999bbd6-cqj6s"] Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.534223 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0ec5b651-57ef-414b-8c8e-4b488d71663f-horizon-secret-key\") pod \"horizon-7cd5c59c66-hrmf5\" (UID: \"0ec5b651-57ef-414b-8c8e-4b488d71663f\") " pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.534266 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ec5b651-57ef-414b-8c8e-4b488d71663f-horizon-tls-certs\") pod \"horizon-7cd5c59c66-hrmf5\" (UID: \"0ec5b651-57ef-414b-8c8e-4b488d71663f\") " pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.534319 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ec5b651-57ef-414b-8c8e-4b488d71663f-logs\") pod \"horizon-7cd5c59c66-hrmf5\" (UID: \"0ec5b651-57ef-414b-8c8e-4b488d71663f\") " pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.534451 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2sch\" (UniqueName: \"kubernetes.io/projected/0ec5b651-57ef-414b-8c8e-4b488d71663f-kube-api-access-d2sch\") pod \"horizon-7cd5c59c66-hrmf5\" (UID: \"0ec5b651-57ef-414b-8c8e-4b488d71663f\") " pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.534513 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ec5b651-57ef-414b-8c8e-4b488d71663f-scripts\") pod 
\"horizon-7cd5c59c66-hrmf5\" (UID: \"0ec5b651-57ef-414b-8c8e-4b488d71663f\") " pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.534560 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ec5b651-57ef-414b-8c8e-4b488d71663f-combined-ca-bundle\") pod \"horizon-7cd5c59c66-hrmf5\" (UID: \"0ec5b651-57ef-414b-8c8e-4b488d71663f\") " pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.534611 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ec5b651-57ef-414b-8c8e-4b488d71663f-config-data\") pod \"horizon-7cd5c59c66-hrmf5\" (UID: \"0ec5b651-57ef-414b-8c8e-4b488d71663f\") " pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.535085 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ec5b651-57ef-414b-8c8e-4b488d71663f-logs\") pod \"horizon-7cd5c59c66-hrmf5\" (UID: \"0ec5b651-57ef-414b-8c8e-4b488d71663f\") " pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.542002 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ec5b651-57ef-414b-8c8e-4b488d71663f-config-data\") pod \"horizon-7cd5c59c66-hrmf5\" (UID: \"0ec5b651-57ef-414b-8c8e-4b488d71663f\") " pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.542824 4563 generic.go:334] "Generic (PLEG): container finished" podID="d128ad0e-d5fb-4f46-a737-b68fb15b7dc6" containerID="cd9a6e0bb9cf33eeb070fb668ea6ead3754b52490ba66ba3b99f0717f840c0b9" exitCode=0 Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.542889 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-sync-ngwrz" event={"ID":"d128ad0e-d5fb-4f46-a737-b68fb15b7dc6","Type":"ContainerDied","Data":"cd9a6e0bb9cf33eeb070fb668ea6ead3754b52490ba66ba3b99f0717f840c0b9"} Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.547436 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ec5b651-57ef-414b-8c8e-4b488d71663f-scripts\") pod \"horizon-7cd5c59c66-hrmf5\" (UID: \"0ec5b651-57ef-414b-8c8e-4b488d71663f\") " pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.565206 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0ec5b651-57ef-414b-8c8e-4b488d71663f-horizon-secret-key\") pod \"horizon-7cd5c59c66-hrmf5\" (UID: \"0ec5b651-57ef-414b-8c8e-4b488d71663f\") " pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.565727 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ec5b651-57ef-414b-8c8e-4b488d71663f-combined-ca-bundle\") pod \"horizon-7cd5c59c66-hrmf5\" (UID: \"0ec5b651-57ef-414b-8c8e-4b488d71663f\") " pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.578092 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2sch\" (UniqueName: \"kubernetes.io/projected/0ec5b651-57ef-414b-8c8e-4b488d71663f-kube-api-access-d2sch\") pod \"horizon-7cd5c59c66-hrmf5\" (UID: \"0ec5b651-57ef-414b-8c8e-4b488d71663f\") " pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.585037 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ec5b651-57ef-414b-8c8e-4b488d71663f-horizon-tls-certs\") pod \"horizon-7cd5c59c66-hrmf5\" (UID: 
\"0ec5b651-57ef-414b-8c8e-4b488d71663f\") " pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.636862 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7688cb4-70ea-43e4-85f2-6b96f972538f-combined-ca-bundle\") pod \"horizon-6d999bbd6-cqj6s\" (UID: \"a7688cb4-70ea-43e4-85f2-6b96f972538f\") " pod="openstack/horizon-6d999bbd6-cqj6s" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.636917 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a7688cb4-70ea-43e4-85f2-6b96f972538f-horizon-secret-key\") pod \"horizon-6d999bbd6-cqj6s\" (UID: \"a7688cb4-70ea-43e4-85f2-6b96f972538f\") " pod="openstack/horizon-6d999bbd6-cqj6s" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.637045 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd9cc\" (UniqueName: \"kubernetes.io/projected/a7688cb4-70ea-43e4-85f2-6b96f972538f-kube-api-access-fd9cc\") pod \"horizon-6d999bbd6-cqj6s\" (UID: \"a7688cb4-70ea-43e4-85f2-6b96f972538f\") " pod="openstack/horizon-6d999bbd6-cqj6s" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.637074 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7688cb4-70ea-43e4-85f2-6b96f972538f-logs\") pod \"horizon-6d999bbd6-cqj6s\" (UID: \"a7688cb4-70ea-43e4-85f2-6b96f972538f\") " pod="openstack/horizon-6d999bbd6-cqj6s" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.637098 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7688cb4-70ea-43e4-85f2-6b96f972538f-horizon-tls-certs\") pod \"horizon-6d999bbd6-cqj6s\" (UID: 
\"a7688cb4-70ea-43e4-85f2-6b96f972538f\") " pod="openstack/horizon-6d999bbd6-cqj6s" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.637139 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a7688cb4-70ea-43e4-85f2-6b96f972538f-config-data\") pod \"horizon-6d999bbd6-cqj6s\" (UID: \"a7688cb4-70ea-43e4-85f2-6b96f972538f\") " pod="openstack/horizon-6d999bbd6-cqj6s" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.637168 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7688cb4-70ea-43e4-85f2-6b96f972538f-scripts\") pod \"horizon-6d999bbd6-cqj6s\" (UID: \"a7688cb4-70ea-43e4-85f2-6b96f972538f\") " pod="openstack/horizon-6d999bbd6-cqj6s" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.648622 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.738971 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd9cc\" (UniqueName: \"kubernetes.io/projected/a7688cb4-70ea-43e4-85f2-6b96f972538f-kube-api-access-fd9cc\") pod \"horizon-6d999bbd6-cqj6s\" (UID: \"a7688cb4-70ea-43e4-85f2-6b96f972538f\") " pod="openstack/horizon-6d999bbd6-cqj6s" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.739358 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7688cb4-70ea-43e4-85f2-6b96f972538f-logs\") pod \"horizon-6d999bbd6-cqj6s\" (UID: \"a7688cb4-70ea-43e4-85f2-6b96f972538f\") " pod="openstack/horizon-6d999bbd6-cqj6s" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.739390 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a7688cb4-70ea-43e4-85f2-6b96f972538f-horizon-tls-certs\") pod \"horizon-6d999bbd6-cqj6s\" (UID: \"a7688cb4-70ea-43e4-85f2-6b96f972538f\") " pod="openstack/horizon-6d999bbd6-cqj6s" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.739830 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a7688cb4-70ea-43e4-85f2-6b96f972538f-config-data\") pod \"horizon-6d999bbd6-cqj6s\" (UID: \"a7688cb4-70ea-43e4-85f2-6b96f972538f\") " pod="openstack/horizon-6d999bbd6-cqj6s" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.739870 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7688cb4-70ea-43e4-85f2-6b96f972538f-scripts\") pod \"horizon-6d999bbd6-cqj6s\" (UID: \"a7688cb4-70ea-43e4-85f2-6b96f972538f\") " pod="openstack/horizon-6d999bbd6-cqj6s" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.739901 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7688cb4-70ea-43e4-85f2-6b96f972538f-combined-ca-bundle\") pod \"horizon-6d999bbd6-cqj6s\" (UID: \"a7688cb4-70ea-43e4-85f2-6b96f972538f\") " pod="openstack/horizon-6d999bbd6-cqj6s" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.739921 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a7688cb4-70ea-43e4-85f2-6b96f972538f-horizon-secret-key\") pod \"horizon-6d999bbd6-cqj6s\" (UID: \"a7688cb4-70ea-43e4-85f2-6b96f972538f\") " pod="openstack/horizon-6d999bbd6-cqj6s" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.739779 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7688cb4-70ea-43e4-85f2-6b96f972538f-logs\") pod \"horizon-6d999bbd6-cqj6s\" (UID: 
\"a7688cb4-70ea-43e4-85f2-6b96f972538f\") " pod="openstack/horizon-6d999bbd6-cqj6s" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.740924 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7688cb4-70ea-43e4-85f2-6b96f972538f-scripts\") pod \"horizon-6d999bbd6-cqj6s\" (UID: \"a7688cb4-70ea-43e4-85f2-6b96f972538f\") " pod="openstack/horizon-6d999bbd6-cqj6s" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.741078 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a7688cb4-70ea-43e4-85f2-6b96f972538f-config-data\") pod \"horizon-6d999bbd6-cqj6s\" (UID: \"a7688cb4-70ea-43e4-85f2-6b96f972538f\") " pod="openstack/horizon-6d999bbd6-cqj6s" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.742939 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a7688cb4-70ea-43e4-85f2-6b96f972538f-horizon-secret-key\") pod \"horizon-6d999bbd6-cqj6s\" (UID: \"a7688cb4-70ea-43e4-85f2-6b96f972538f\") " pod="openstack/horizon-6d999bbd6-cqj6s" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.743276 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7688cb4-70ea-43e4-85f2-6b96f972538f-horizon-tls-certs\") pod \"horizon-6d999bbd6-cqj6s\" (UID: \"a7688cb4-70ea-43e4-85f2-6b96f972538f\") " pod="openstack/horizon-6d999bbd6-cqj6s" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.747076 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7688cb4-70ea-43e4-85f2-6b96f972538f-combined-ca-bundle\") pod \"horizon-6d999bbd6-cqj6s\" (UID: \"a7688cb4-70ea-43e4-85f2-6b96f972538f\") " pod="openstack/horizon-6d999bbd6-cqj6s" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.754139 4563 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd9cc\" (UniqueName: \"kubernetes.io/projected/a7688cb4-70ea-43e4-85f2-6b96f972538f-kube-api-access-fd9cc\") pod \"horizon-6d999bbd6-cqj6s\" (UID: \"a7688cb4-70ea-43e4-85f2-6b96f972538f\") " pod="openstack/horizon-6d999bbd6-cqj6s" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.846298 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d999bbd6-cqj6s" Nov 24 09:18:34 crc kubenswrapper[4563]: I1124 09:18:34.943811 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.045101 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmrkt\" (UniqueName: \"kubernetes.io/projected/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-kube-api-access-pmrkt\") pod \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.045261 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-httpd-run\") pod \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.045311 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.045359 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-combined-ca-bundle\") pod 
\"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.045426 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-scripts\") pod \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.045537 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-config-data\") pod \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.045559 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-internal-tls-certs\") pod \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.045700 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-logs\") pod \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\" (UID: \"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab\") " Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.046669 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-logs" (OuterVolumeSpecName: "logs") pod "ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab" (UID: "ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.049602 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-kube-api-access-pmrkt" (OuterVolumeSpecName: "kube-api-access-pmrkt") pod "ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab" (UID: "ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab"). InnerVolumeSpecName "kube-api-access-pmrkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.049868 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab" (UID: "ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.050110 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-scripts" (OuterVolumeSpecName: "scripts") pod "ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab" (UID: "ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.053972 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab" (UID: "ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.070443 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab" (UID: "ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.086290 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-config-data" (OuterVolumeSpecName: "config-data") pod "ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab" (UID: "ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.095562 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab" (UID: "ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.147629 4563 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-logs\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.147670 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmrkt\" (UniqueName: \"kubernetes.io/projected/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-kube-api-access-pmrkt\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.147683 4563 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.147708 4563 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.147718 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.147727 4563 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.147736 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.147745 4563 reconciler_common.go:293] "Volume detached for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.163081 4563 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.250303 4563 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.474843 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.529865 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9fdb784c-47fgk"] Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.533893 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk" podUID="46649dc4-4337-4378-a0a1-70b329141a22" containerName="dnsmasq-dns" containerID="cri-o://e7636c149a4b5bd1098aa169a69ea8d47f767d6a95e9db54cf48fdd3ebc2b10b" gracePeriod=10 Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.557266 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab","Type":"ContainerDied","Data":"a3eb24d9d3a77b7a2de2449ca3c14fc4753de383a772384c64af3a5af38fd5ce"} Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.557335 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.557362 4563 scope.go:117] "RemoveContainer" containerID="32e096ac55448cf12bc1f7b41cd77e0a5c191a51d755f53056683685c2fc8ecf" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.625575 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.636469 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.640868 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 09:18:35 crc kubenswrapper[4563]: E1124 09:18:35.641338 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab" containerName="glance-log" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.641559 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab" containerName="glance-log" Nov 24 09:18:35 crc kubenswrapper[4563]: E1124 09:18:35.641731 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab" containerName="glance-httpd" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.641800 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab" containerName="glance-httpd" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.642077 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab" containerName="glance-httpd" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.642151 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab" containerName="glance-log" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.643019 4563 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.645950 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.647071 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.656722 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.762700 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/126732ea-0f6a-4fd6-9b5b-959e4da904fe-logs\") pod \"glance-default-internal-api-0\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.763000 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126732ea-0f6a-4fd6-9b5b-959e4da904fe-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.763024 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/126732ea-0f6a-4fd6-9b5b-959e4da904fe-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.763066 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/126732ea-0f6a-4fd6-9b5b-959e4da904fe-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.763151 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.763192 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/126732ea-0f6a-4fd6-9b5b-959e4da904fe-scripts\") pod \"glance-default-internal-api-0\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.763221 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126732ea-0f6a-4fd6-9b5b-959e4da904fe-config-data\") pod \"glance-default-internal-api-0\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.763253 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wblvn\" (UniqueName: \"kubernetes.io/projected/126732ea-0f6a-4fd6-9b5b-959e4da904fe-kube-api-access-wblvn\") pod \"glance-default-internal-api-0\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.764802 4563 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-r9lpl"] Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.766804 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r9lpl" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.778399 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r9lpl"] Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.865505 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c74cc\" (UniqueName: \"kubernetes.io/projected/e26141fd-4cfa-4726-ba65-1f3bb830411b-kube-api-access-c74cc\") pod \"redhat-operators-r9lpl\" (UID: \"e26141fd-4cfa-4726-ba65-1f3bb830411b\") " pod="openshift-marketplace/redhat-operators-r9lpl" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.865550 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/126732ea-0f6a-4fd6-9b5b-959e4da904fe-scripts\") pod \"glance-default-internal-api-0\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.865581 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126732ea-0f6a-4fd6-9b5b-959e4da904fe-config-data\") pod \"glance-default-internal-api-0\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.865599 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e26141fd-4cfa-4726-ba65-1f3bb830411b-utilities\") pod \"redhat-operators-r9lpl\" (UID: \"e26141fd-4cfa-4726-ba65-1f3bb830411b\") " pod="openshift-marketplace/redhat-operators-r9lpl" Nov 24 
09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.865626 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wblvn\" (UniqueName: \"kubernetes.io/projected/126732ea-0f6a-4fd6-9b5b-959e4da904fe-kube-api-access-wblvn\") pod \"glance-default-internal-api-0\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.865679 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e26141fd-4cfa-4726-ba65-1f3bb830411b-catalog-content\") pod \"redhat-operators-r9lpl\" (UID: \"e26141fd-4cfa-4726-ba65-1f3bb830411b\") " pod="openshift-marketplace/redhat-operators-r9lpl" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.865706 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/126732ea-0f6a-4fd6-9b5b-959e4da904fe-logs\") pod \"glance-default-internal-api-0\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.865750 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126732ea-0f6a-4fd6-9b5b-959e4da904fe-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.865765 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/126732ea-0f6a-4fd6-9b5b-959e4da904fe-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:35 crc 
kubenswrapper[4563]: I1124 09:18:35.865789 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/126732ea-0f6a-4fd6-9b5b-959e4da904fe-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.865832 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.866215 4563 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.870029 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/126732ea-0f6a-4fd6-9b5b-959e4da904fe-logs\") pod \"glance-default-internal-api-0\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.872049 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/126732ea-0f6a-4fd6-9b5b-959e4da904fe-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.885586 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/126732ea-0f6a-4fd6-9b5b-959e4da904fe-scripts\") pod \"glance-default-internal-api-0\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.887515 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126732ea-0f6a-4fd6-9b5b-959e4da904fe-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.891595 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/126732ea-0f6a-4fd6-9b5b-959e4da904fe-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.894284 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126732ea-0f6a-4fd6-9b5b-959e4da904fe-config-data\") pod \"glance-default-internal-api-0\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.909700 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wblvn\" (UniqueName: \"kubernetes.io/projected/126732ea-0f6a-4fd6-9b5b-959e4da904fe-kube-api-access-wblvn\") pod \"glance-default-internal-api-0\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.910866 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.967743 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e26141fd-4cfa-4726-ba65-1f3bb830411b-catalog-content\") pod \"redhat-operators-r9lpl\" (UID: \"e26141fd-4cfa-4726-ba65-1f3bb830411b\") " pod="openshift-marketplace/redhat-operators-r9lpl" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.967987 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c74cc\" (UniqueName: \"kubernetes.io/projected/e26141fd-4cfa-4726-ba65-1f3bb830411b-kube-api-access-c74cc\") pod \"redhat-operators-r9lpl\" (UID: \"e26141fd-4cfa-4726-ba65-1f3bb830411b\") " pod="openshift-marketplace/redhat-operators-r9lpl" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.968037 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e26141fd-4cfa-4726-ba65-1f3bb830411b-utilities\") pod \"redhat-operators-r9lpl\" (UID: \"e26141fd-4cfa-4726-ba65-1f3bb830411b\") " pod="openshift-marketplace/redhat-operators-r9lpl" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.968490 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e26141fd-4cfa-4726-ba65-1f3bb830411b-utilities\") pod \"redhat-operators-r9lpl\" (UID: \"e26141fd-4cfa-4726-ba65-1f3bb830411b\") " pod="openshift-marketplace/redhat-operators-r9lpl" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.968841 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e26141fd-4cfa-4726-ba65-1f3bb830411b-catalog-content\") pod 
\"redhat-operators-r9lpl\" (UID: \"e26141fd-4cfa-4726-ba65-1f3bb830411b\") " pod="openshift-marketplace/redhat-operators-r9lpl" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.974472 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 24 09:18:35 crc kubenswrapper[4563]: I1124 09:18:35.982138 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c74cc\" (UniqueName: \"kubernetes.io/projected/e26141fd-4cfa-4726-ba65-1f3bb830411b-kube-api-access-c74cc\") pod \"redhat-operators-r9lpl\" (UID: \"e26141fd-4cfa-4726-ba65-1f3bb830411b\") " pod="openshift-marketplace/redhat-operators-r9lpl" Nov 24 09:18:36 crc kubenswrapper[4563]: I1124 09:18:36.133277 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r9lpl" Nov 24 09:18:36 crc kubenswrapper[4563]: I1124 09:18:36.574030 4563 generic.go:334] "Generic (PLEG): container finished" podID="46649dc4-4337-4378-a0a1-70b329141a22" containerID="e7636c149a4b5bd1098aa169a69ea8d47f767d6a95e9db54cf48fdd3ebc2b10b" exitCode=0 Nov 24 09:18:36 crc kubenswrapper[4563]: I1124 09:18:36.574087 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk" event={"ID":"46649dc4-4337-4378-a0a1-70b329141a22","Type":"ContainerDied","Data":"e7636c149a4b5bd1098aa169a69ea8d47f767d6a95e9db54cf48fdd3ebc2b10b"} Nov 24 09:18:37 crc kubenswrapper[4563]: I1124 09:18:37.063535 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab" path="/var/lib/kubelet/pods/ef7d1434-5a60-4b4b-8ad7-ac8e32ebd5ab/volumes" Nov 24 09:18:37 crc kubenswrapper[4563]: W1124 09:18:37.117945 4563 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod571d80f9_80d0_4dae_bfa0_126c8055f9b0.slice/crio-db91f4523ae27bdd03dd5e4bd236394ff19a0e5fb5172e04beaf6a8d4bafa43d WatchSource:0}: Error finding container db91f4523ae27bdd03dd5e4bd236394ff19a0e5fb5172e04beaf6a8d4bafa43d: Status 404 returned error can't find the container with id db91f4523ae27bdd03dd5e4bd236394ff19a0e5fb5172e04beaf6a8d4bafa43d Nov 24 09:18:37 crc kubenswrapper[4563]: W1124 09:18:37.119564 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cead1d3_35c0_4274_be9c_d75115aedd8a.slice/crio-144eee9f484b95119b714357ff67b9c8e69e0585b73380ad334091e47571516d WatchSource:0}: Error finding container 144eee9f484b95119b714357ff67b9c8e69e0585b73380ad334091e47571516d: Status 404 returned error can't find the container with id 144eee9f484b95119b714357ff67b9c8e69e0585b73380ad334091e47571516d Nov 24 09:18:37 crc kubenswrapper[4563]: I1124 09:18:37.581731 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rppmx" event={"ID":"571d80f9-80d0-4dae-bfa0-126c8055f9b0","Type":"ContainerStarted","Data":"db91f4523ae27bdd03dd5e4bd236394ff19a0e5fb5172e04beaf6a8d4bafa43d"} Nov 24 09:18:37 crc kubenswrapper[4563]: I1124 09:18:37.582991 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6cead1d3-35c0-4274-be9c-d75115aedd8a","Type":"ContainerStarted","Data":"144eee9f484b95119b714357ff67b9c8e69e0585b73380ad334091e47571516d"} Nov 24 09:18:38 crc kubenswrapper[4563]: I1124 09:18:38.289516 4563 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk" podUID="46649dc4-4337-4378-a0a1-70b329141a22" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.119:5353: connect: connection refused" Nov 24 09:18:40 crc kubenswrapper[4563]: E1124 09:18:40.941423 4563 log.go:32] 
"PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon@sha256:65b94ff9fcd486845fb0544583bf2a973246a61a0ad32340fb92d632285f1057" Nov 24 09:18:40 crc kubenswrapper[4563]: E1124 09:18:40.941775 4563 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon@sha256:65b94ff9fcd486845fb0544583bf2a973246a61a0ad32340fb92d632285f1057,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfh5fch56h645h548h5b9h55hf8h66ch74h594h55dh68h59fh5bch649h84h589h86h66bh9bh589h579h697h649h97h65chb9h5fdh5b4h5b4h54fq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zxxm9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPr
ivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5ffd6f76f7-4hmjv_openstack(0edbf62a-a3fc-416a-a94d-395da81b7b63): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 24 09:18:40 crc kubenswrapper[4563]: E1124 09:18:40.944488 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon@sha256:65b94ff9fcd486845fb0544583bf2a973246a61a0ad32340fb92d632285f1057\\\"\"]" pod="openstack/horizon-5ffd6f76f7-4hmjv" podUID="0edbf62a-a3fc-416a-a94d-395da81b7b63" Nov 24 09:18:43 crc kubenswrapper[4563]: I1124 09:18:43.289931 4563 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk" podUID="46649dc4-4337-4378-a0a1-70b329141a22" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.119:5353: connect: connection refused" Nov 24 09:18:47 crc kubenswrapper[4563]: E1124 09:18:47.611374 4563 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api@sha256:7dd2e0dbb6bb5a6cecd1763e43479ca8cb6a0c502534e83c8795c0da2b50e099" Nov 24 09:18:47 crc kubenswrapper[4563]: E1124 09:18:47.611835 4563 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:7dd2e0dbb6bb5a6cecd1763e43479ca8cb6a0c502534e83c8795c0da2b50e099,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-npgkh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSou
rce{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-8klw7_openstack(d803d6ca-646a-4dd5-93ef-d096b501c28a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 24 09:18:47 crc kubenswrapper[4563]: E1124 09:18:47.613298 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-8klw7" podUID="d803d6ca-646a-4dd5-93ef-d096b501c28a" Nov 24 09:18:47 crc kubenswrapper[4563]: E1124 09:18:47.621950 4563 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon@sha256:65b94ff9fcd486845fb0544583bf2a973246a61a0ad32340fb92d632285f1057" Nov 24 09:18:47 crc kubenswrapper[4563]: E1124 09:18:47.622131 4563 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon@sha256:65b94ff9fcd486845fb0544583bf2a973246a61a0ad32340fb92d632285f1057,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n7ch674h57ch666h55dhdhd7h594h55h57dh58bh649h596h5fch578h678h5c4h5f9h585h7ch5f9h5fh545h84h5bdh678h567h55dh54dh5cdh66dhf5q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-68lxl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-86c668996f-cnx27_openstack(e6fe7506-2be4-49fb-9295-bf5960b88baf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 24 09:18:47 crc kubenswrapper[4563]: E1124 
09:18:47.625192 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon@sha256:65b94ff9fcd486845fb0544583bf2a973246a61a0ad32340fb92d632285f1057\\\"\"]" pod="openstack/horizon-86c668996f-cnx27" podUID="e6fe7506-2be4-49fb-9295-bf5960b88baf" Nov 24 09:18:47 crc kubenswrapper[4563]: E1124 09:18:47.626196 4563 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon@sha256:65b94ff9fcd486845fb0544583bf2a973246a61a0ad32340fb92d632285f1057" Nov 24 09:18:47 crc kubenswrapper[4563]: E1124 09:18:47.626360 4563 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon@sha256:65b94ff9fcd486845fb0544583bf2a973246a61a0ad32340fb92d632285f1057,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5ffh68bh68h66ch6fh59dh647h9bhc5h5d9h5ddh5bbh68bhbdh555h68dh8bh65fh5ddh648h5bdh548h597hc8hdfh87h87h67hffh87h9fh75q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bq4pl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-895bd5bbf-mc6v8_openstack(38e4b9dc-4234-4602-9111-514d6d94e10b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 24 09:18:47 crc kubenswrapper[4563]: E1124 
09:18:47.628160 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon@sha256:65b94ff9fcd486845fb0544583bf2a973246a61a0ad32340fb92d632285f1057\\\"\"]" pod="openstack/horizon-895bd5bbf-mc6v8" podUID="38e4b9dc-4234-4602-9111-514d6d94e10b" Nov 24 09:18:47 crc kubenswrapper[4563]: I1124 09:18:47.675745 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-ngwrz" Nov 24 09:18:47 crc kubenswrapper[4563]: I1124 09:18:47.676206 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ngwrz" event={"ID":"d128ad0e-d5fb-4f46-a737-b68fb15b7dc6","Type":"ContainerDied","Data":"f60033dc81098f9cf3acee6f0c67e4ef3ed6d30b9886b9edab123060ff055e53"} Nov 24 09:18:47 crc kubenswrapper[4563]: I1124 09:18:47.676237 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f60033dc81098f9cf3acee6f0c67e4ef3ed6d30b9886b9edab123060ff055e53" Nov 24 09:18:47 crc kubenswrapper[4563]: E1124 09:18:47.678202 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api@sha256:7dd2e0dbb6bb5a6cecd1763e43479ca8cb6a0c502534e83c8795c0da2b50e099\\\"\"" pod="openstack/placement-db-sync-8klw7" podUID="d803d6ca-646a-4dd5-93ef-d096b501c28a" Nov 24 09:18:47 crc kubenswrapper[4563]: I1124 09:18:47.698943 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d128ad0e-d5fb-4f46-a737-b68fb15b7dc6-config\") pod \"d128ad0e-d5fb-4f46-a737-b68fb15b7dc6\" (UID: 
\"d128ad0e-d5fb-4f46-a737-b68fb15b7dc6\") " Nov 24 09:18:47 crc kubenswrapper[4563]: I1124 09:18:47.698999 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8p8ls\" (UniqueName: \"kubernetes.io/projected/d128ad0e-d5fb-4f46-a737-b68fb15b7dc6-kube-api-access-8p8ls\") pod \"d128ad0e-d5fb-4f46-a737-b68fb15b7dc6\" (UID: \"d128ad0e-d5fb-4f46-a737-b68fb15b7dc6\") " Nov 24 09:18:47 crc kubenswrapper[4563]: I1124 09:18:47.699179 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d128ad0e-d5fb-4f46-a737-b68fb15b7dc6-combined-ca-bundle\") pod \"d128ad0e-d5fb-4f46-a737-b68fb15b7dc6\" (UID: \"d128ad0e-d5fb-4f46-a737-b68fb15b7dc6\") " Nov 24 09:18:47 crc kubenswrapper[4563]: I1124 09:18:47.709727 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d128ad0e-d5fb-4f46-a737-b68fb15b7dc6-kube-api-access-8p8ls" (OuterVolumeSpecName: "kube-api-access-8p8ls") pod "d128ad0e-d5fb-4f46-a737-b68fb15b7dc6" (UID: "d128ad0e-d5fb-4f46-a737-b68fb15b7dc6"). InnerVolumeSpecName "kube-api-access-8p8ls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:18:47 crc kubenswrapper[4563]: I1124 09:18:47.726929 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d128ad0e-d5fb-4f46-a737-b68fb15b7dc6-config" (OuterVolumeSpecName: "config") pod "d128ad0e-d5fb-4f46-a737-b68fb15b7dc6" (UID: "d128ad0e-d5fb-4f46-a737-b68fb15b7dc6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:47 crc kubenswrapper[4563]: I1124 09:18:47.751254 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d128ad0e-d5fb-4f46-a737-b68fb15b7dc6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d128ad0e-d5fb-4f46-a737-b68fb15b7dc6" (UID: "d128ad0e-d5fb-4f46-a737-b68fb15b7dc6"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:47 crc kubenswrapper[4563]: I1124 09:18:47.800958 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d128ad0e-d5fb-4f46-a737-b68fb15b7dc6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:47 crc kubenswrapper[4563]: I1124 09:18:47.801246 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d128ad0e-d5fb-4f46-a737-b68fb15b7dc6-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:47 crc kubenswrapper[4563]: I1124 09:18:47.801259 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8p8ls\" (UniqueName: \"kubernetes.io/projected/d128ad0e-d5fb-4f46-a737-b68fb15b7dc6-kube-api-access-8p8ls\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:48 crc kubenswrapper[4563]: E1124 09:18:48.115677 4563 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:4c93a5cccb9971e24f05daf93b3aa11ba71752bc3469a1a1a2c4906f92f69645" Nov 24 09:18:48 crc kubenswrapper[4563]: E1124 09:18:48.115865 4563 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:4c93a5cccb9971e24f05daf93b3aa11ba71752bc3469a1a1a2c4906f92f69645,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tjgcb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-frjwb_openstack(ed99b33d-c985-44bf-9a4a-b9f93bf3927f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 24 09:18:48 crc kubenswrapper[4563]: E1124 09:18:48.117116 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-frjwb" 
podUID="ed99b33d-c985-44bf-9a4a-b9f93bf3927f" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.117879 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-895bd5bbf-mc6v8" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.124793 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.135255 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5ffd6f76f7-4hmjv" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.310731 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46649dc4-4337-4378-a0a1-70b329141a22-ovsdbserver-sb\") pod \"46649dc4-4337-4378-a0a1-70b329141a22\" (UID: \"46649dc4-4337-4378-a0a1-70b329141a22\") " Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.310781 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0edbf62a-a3fc-416a-a94d-395da81b7b63-horizon-secret-key\") pod \"0edbf62a-a3fc-416a-a94d-395da81b7b63\" (UID: \"0edbf62a-a3fc-416a-a94d-395da81b7b63\") " Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.310815 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0edbf62a-a3fc-416a-a94d-395da81b7b63-config-data\") pod \"0edbf62a-a3fc-416a-a94d-395da81b7b63\" (UID: \"0edbf62a-a3fc-416a-a94d-395da81b7b63\") " Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.310842 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7kvz\" (UniqueName: \"kubernetes.io/projected/46649dc4-4337-4378-a0a1-70b329141a22-kube-api-access-l7kvz\") pod \"46649dc4-4337-4378-a0a1-70b329141a22\" (UID: 
\"46649dc4-4337-4378-a0a1-70b329141a22\") " Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.310901 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0edbf62a-a3fc-416a-a94d-395da81b7b63-scripts\") pod \"0edbf62a-a3fc-416a-a94d-395da81b7b63\" (UID: \"0edbf62a-a3fc-416a-a94d-395da81b7b63\") " Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.311011 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0edbf62a-a3fc-416a-a94d-395da81b7b63-logs\") pod \"0edbf62a-a3fc-416a-a94d-395da81b7b63\" (UID: \"0edbf62a-a3fc-416a-a94d-395da81b7b63\") " Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.311063 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46649dc4-4337-4378-a0a1-70b329141a22-dns-svc\") pod \"46649dc4-4337-4378-a0a1-70b329141a22\" (UID: \"46649dc4-4337-4378-a0a1-70b329141a22\") " Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.311085 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxxm9\" (UniqueName: \"kubernetes.io/projected/0edbf62a-a3fc-416a-a94d-395da81b7b63-kube-api-access-zxxm9\") pod \"0edbf62a-a3fc-416a-a94d-395da81b7b63\" (UID: \"0edbf62a-a3fc-416a-a94d-395da81b7b63\") " Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.311132 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/38e4b9dc-4234-4602-9111-514d6d94e10b-horizon-secret-key\") pod \"38e4b9dc-4234-4602-9111-514d6d94e10b\" (UID: \"38e4b9dc-4234-4602-9111-514d6d94e10b\") " Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.311169 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/38e4b9dc-4234-4602-9111-514d6d94e10b-logs\") pod \"38e4b9dc-4234-4602-9111-514d6d94e10b\" (UID: \"38e4b9dc-4234-4602-9111-514d6d94e10b\") " Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.311213 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46649dc4-4337-4378-a0a1-70b329141a22-ovsdbserver-nb\") pod \"46649dc4-4337-4378-a0a1-70b329141a22\" (UID: \"46649dc4-4337-4378-a0a1-70b329141a22\") " Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.311288 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38e4b9dc-4234-4602-9111-514d6d94e10b-scripts\") pod \"38e4b9dc-4234-4602-9111-514d6d94e10b\" (UID: \"38e4b9dc-4234-4602-9111-514d6d94e10b\") " Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.311306 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46649dc4-4337-4378-a0a1-70b329141a22-config\") pod \"46649dc4-4337-4378-a0a1-70b329141a22\" (UID: \"46649dc4-4337-4378-a0a1-70b329141a22\") " Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.311322 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq4pl\" (UniqueName: \"kubernetes.io/projected/38e4b9dc-4234-4602-9111-514d6d94e10b-kube-api-access-bq4pl\") pod \"38e4b9dc-4234-4602-9111-514d6d94e10b\" (UID: \"38e4b9dc-4234-4602-9111-514d6d94e10b\") " Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.311356 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38e4b9dc-4234-4602-9111-514d6d94e10b-config-data\") pod \"38e4b9dc-4234-4602-9111-514d6d94e10b\" (UID: \"38e4b9dc-4234-4602-9111-514d6d94e10b\") " Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.311451 4563 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0edbf62a-a3fc-416a-a94d-395da81b7b63-logs" (OuterVolumeSpecName: "logs") pod "0edbf62a-a3fc-416a-a94d-395da81b7b63" (UID: "0edbf62a-a3fc-416a-a94d-395da81b7b63"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.311796 4563 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0edbf62a-a3fc-416a-a94d-395da81b7b63-logs\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.312032 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38e4b9dc-4234-4602-9111-514d6d94e10b-logs" (OuterVolumeSpecName: "logs") pod "38e4b9dc-4234-4602-9111-514d6d94e10b" (UID: "38e4b9dc-4234-4602-9111-514d6d94e10b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.312213 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0edbf62a-a3fc-416a-a94d-395da81b7b63-scripts" (OuterVolumeSpecName: "scripts") pod "0edbf62a-a3fc-416a-a94d-395da81b7b63" (UID: "0edbf62a-a3fc-416a-a94d-395da81b7b63"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.312256 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38e4b9dc-4234-4602-9111-514d6d94e10b-scripts" (OuterVolumeSpecName: "scripts") pod "38e4b9dc-4234-4602-9111-514d6d94e10b" (UID: "38e4b9dc-4234-4602-9111-514d6d94e10b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.312395 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38e4b9dc-4234-4602-9111-514d6d94e10b-config-data" (OuterVolumeSpecName: "config-data") pod "38e4b9dc-4234-4602-9111-514d6d94e10b" (UID: "38e4b9dc-4234-4602-9111-514d6d94e10b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.312517 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0edbf62a-a3fc-416a-a94d-395da81b7b63-config-data" (OuterVolumeSpecName: "config-data") pod "0edbf62a-a3fc-416a-a94d-395da81b7b63" (UID: "0edbf62a-a3fc-416a-a94d-395da81b7b63"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.315729 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46649dc4-4337-4378-a0a1-70b329141a22-kube-api-access-l7kvz" (OuterVolumeSpecName: "kube-api-access-l7kvz") pod "46649dc4-4337-4378-a0a1-70b329141a22" (UID: "46649dc4-4337-4378-a0a1-70b329141a22"). InnerVolumeSpecName "kube-api-access-l7kvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.316399 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0edbf62a-a3fc-416a-a94d-395da81b7b63-kube-api-access-zxxm9" (OuterVolumeSpecName: "kube-api-access-zxxm9") pod "0edbf62a-a3fc-416a-a94d-395da81b7b63" (UID: "0edbf62a-a3fc-416a-a94d-395da81b7b63"). InnerVolumeSpecName "kube-api-access-zxxm9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.316745 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38e4b9dc-4234-4602-9111-514d6d94e10b-kube-api-access-bq4pl" (OuterVolumeSpecName: "kube-api-access-bq4pl") pod "38e4b9dc-4234-4602-9111-514d6d94e10b" (UID: "38e4b9dc-4234-4602-9111-514d6d94e10b"). InnerVolumeSpecName "kube-api-access-bq4pl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.316932 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38e4b9dc-4234-4602-9111-514d6d94e10b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "38e4b9dc-4234-4602-9111-514d6d94e10b" (UID: "38e4b9dc-4234-4602-9111-514d6d94e10b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.317095 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0edbf62a-a3fc-416a-a94d-395da81b7b63-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "0edbf62a-a3fc-416a-a94d-395da81b7b63" (UID: "0edbf62a-a3fc-416a-a94d-395da81b7b63"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.343181 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46649dc4-4337-4378-a0a1-70b329141a22-config" (OuterVolumeSpecName: "config") pod "46649dc4-4337-4378-a0a1-70b329141a22" (UID: "46649dc4-4337-4378-a0a1-70b329141a22"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.345771 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46649dc4-4337-4378-a0a1-70b329141a22-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "46649dc4-4337-4378-a0a1-70b329141a22" (UID: "46649dc4-4337-4378-a0a1-70b329141a22"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.345951 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46649dc4-4337-4378-a0a1-70b329141a22-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "46649dc4-4337-4378-a0a1-70b329141a22" (UID: "46649dc4-4337-4378-a0a1-70b329141a22"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.348704 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46649dc4-4337-4378-a0a1-70b329141a22-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "46649dc4-4337-4378-a0a1-70b329141a22" (UID: "46649dc4-4337-4378-a0a1-70b329141a22"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.414614 4563 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46649dc4-4337-4378-a0a1-70b329141a22-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.414669 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxxm9\" (UniqueName: \"kubernetes.io/projected/0edbf62a-a3fc-416a-a94d-395da81b7b63-kube-api-access-zxxm9\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.414688 4563 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/38e4b9dc-4234-4602-9111-514d6d94e10b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.414706 4563 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38e4b9dc-4234-4602-9111-514d6d94e10b-logs\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.414717 4563 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46649dc4-4337-4378-a0a1-70b329141a22-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.414729 4563 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38e4b9dc-4234-4602-9111-514d6d94e10b-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.414741 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46649dc4-4337-4378-a0a1-70b329141a22-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.414752 4563 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-bq4pl\" (UniqueName: \"kubernetes.io/projected/38e4b9dc-4234-4602-9111-514d6d94e10b-kube-api-access-bq4pl\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.414763 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38e4b9dc-4234-4602-9111-514d6d94e10b-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.414778 4563 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46649dc4-4337-4378-a0a1-70b329141a22-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.414790 4563 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0edbf62a-a3fc-416a-a94d-395da81b7b63-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.414801 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0edbf62a-a3fc-416a-a94d-395da81b7b63-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.414811 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7kvz\" (UniqueName: \"kubernetes.io/projected/46649dc4-4337-4378-a0a1-70b329141a22-kube-api-access-l7kvz\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.414823 4563 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0edbf62a-a3fc-416a-a94d-395da81b7b63-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.689919 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5ffd6f76f7-4hmjv" 
event={"ID":"0edbf62a-a3fc-416a-a94d-395da81b7b63","Type":"ContainerDied","Data":"e6bb6393ae9d4a041574d550aae78f91030f528cd1ba4a9894b25b3427b41bd9"} Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.690022 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5ffd6f76f7-4hmjv" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.699475 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk" event={"ID":"46649dc4-4337-4378-a0a1-70b329141a22","Type":"ContainerDied","Data":"96e2d1582694ad6ebdf3eb01743977005afb8a7a84eeea0d1bac2fdbc1d90368"} Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.699557 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9fdb784c-47fgk" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.705044 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-ngwrz" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.707003 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-895bd5bbf-mc6v8" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.709072 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-895bd5bbf-mc6v8" event={"ID":"38e4b9dc-4234-4602-9111-514d6d94e10b","Type":"ContainerDied","Data":"e0b58534c12d604038f6d0afadd31cac9767018079a7a2aa3ca1db14d4a2c559"} Nov 24 09:18:48 crc kubenswrapper[4563]: E1124 09:18:48.710488 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:4c93a5cccb9971e24f05daf93b3aa11ba71752bc3469a1a1a2c4906f92f69645\\\"\"" pod="openstack/barbican-db-sync-frjwb" podUID="ed99b33d-c985-44bf-9a4a-b9f93bf3927f" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.772486 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5ffd6f76f7-4hmjv"] Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.785141 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5ffd6f76f7-4hmjv"] Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.798587 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9fdb784c-47fgk"] Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.805240 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9fdb784c-47fgk"] Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.816702 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-895bd5bbf-mc6v8"] Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.822253 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-895bd5bbf-mc6v8"] Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.896219 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c654c9745-9ss87"] Nov 24 09:18:48 crc kubenswrapper[4563]: E1124 
09:18:48.896589 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46649dc4-4337-4378-a0a1-70b329141a22" containerName="dnsmasq-dns" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.896602 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="46649dc4-4337-4378-a0a1-70b329141a22" containerName="dnsmasq-dns" Nov 24 09:18:48 crc kubenswrapper[4563]: E1124 09:18:48.896736 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46649dc4-4337-4378-a0a1-70b329141a22" containerName="init" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.896746 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="46649dc4-4337-4378-a0a1-70b329141a22" containerName="init" Nov 24 09:18:48 crc kubenswrapper[4563]: E1124 09:18:48.896760 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d128ad0e-d5fb-4f46-a737-b68fb15b7dc6" containerName="neutron-db-sync" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.896767 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="d128ad0e-d5fb-4f46-a737-b68fb15b7dc6" containerName="neutron-db-sync" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.896929 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="d128ad0e-d5fb-4f46-a737-b68fb15b7dc6" containerName="neutron-db-sync" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.896946 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="46649dc4-4337-4378-a0a1-70b329141a22" containerName="dnsmasq-dns" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.898103 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c654c9745-9ss87" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.908513 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c654c9745-9ss87"] Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.946564 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nctdc\" (UniqueName: \"kubernetes.io/projected/1213cb1f-843a-4e6c-b56c-cf39c8108614-kube-api-access-nctdc\") pod \"dnsmasq-dns-6c654c9745-9ss87\" (UID: \"1213cb1f-843a-4e6c-b56c-cf39c8108614\") " pod="openstack/dnsmasq-dns-6c654c9745-9ss87" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.946677 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-dns-swift-storage-0\") pod \"dnsmasq-dns-6c654c9745-9ss87\" (UID: \"1213cb1f-843a-4e6c-b56c-cf39c8108614\") " pod="openstack/dnsmasq-dns-6c654c9745-9ss87" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.946716 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-ovsdbserver-sb\") pod \"dnsmasq-dns-6c654c9745-9ss87\" (UID: \"1213cb1f-843a-4e6c-b56c-cf39c8108614\") " pod="openstack/dnsmasq-dns-6c654c9745-9ss87" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.946733 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-ovsdbserver-nb\") pod \"dnsmasq-dns-6c654c9745-9ss87\" (UID: \"1213cb1f-843a-4e6c-b56c-cf39c8108614\") " pod="openstack/dnsmasq-dns-6c654c9745-9ss87" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.946753 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-config\") pod \"dnsmasq-dns-6c654c9745-9ss87\" (UID: \"1213cb1f-843a-4e6c-b56c-cf39c8108614\") " pod="openstack/dnsmasq-dns-6c654c9745-9ss87" Nov 24 09:18:48 crc kubenswrapper[4563]: I1124 09:18:48.946809 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-dns-svc\") pod \"dnsmasq-dns-6c654c9745-9ss87\" (UID: \"1213cb1f-843a-4e6c-b56c-cf39c8108614\") " pod="openstack/dnsmasq-dns-6c654c9745-9ss87" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.048883 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-dns-svc\") pod \"dnsmasq-dns-6c654c9745-9ss87\" (UID: \"1213cb1f-843a-4e6c-b56c-cf39c8108614\") " pod="openstack/dnsmasq-dns-6c654c9745-9ss87" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.049189 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nctdc\" (UniqueName: \"kubernetes.io/projected/1213cb1f-843a-4e6c-b56c-cf39c8108614-kube-api-access-nctdc\") pod \"dnsmasq-dns-6c654c9745-9ss87\" (UID: \"1213cb1f-843a-4e6c-b56c-cf39c8108614\") " pod="openstack/dnsmasq-dns-6c654c9745-9ss87" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.049267 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-dns-swift-storage-0\") pod \"dnsmasq-dns-6c654c9745-9ss87\" (UID: \"1213cb1f-843a-4e6c-b56c-cf39c8108614\") " pod="openstack/dnsmasq-dns-6c654c9745-9ss87" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.049300 4563 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-ovsdbserver-sb\") pod \"dnsmasq-dns-6c654c9745-9ss87\" (UID: \"1213cb1f-843a-4e6c-b56c-cf39c8108614\") " pod="openstack/dnsmasq-dns-6c654c9745-9ss87" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.049333 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-ovsdbserver-nb\") pod \"dnsmasq-dns-6c654c9745-9ss87\" (UID: \"1213cb1f-843a-4e6c-b56c-cf39c8108614\") " pod="openstack/dnsmasq-dns-6c654c9745-9ss87" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.049364 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-config\") pod \"dnsmasq-dns-6c654c9745-9ss87\" (UID: \"1213cb1f-843a-4e6c-b56c-cf39c8108614\") " pod="openstack/dnsmasq-dns-6c654c9745-9ss87" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.053063 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-config\") pod \"dnsmasq-dns-6c654c9745-9ss87\" (UID: \"1213cb1f-843a-4e6c-b56c-cf39c8108614\") " pod="openstack/dnsmasq-dns-6c654c9745-9ss87" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.053312 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-dns-swift-storage-0\") pod \"dnsmasq-dns-6c654c9745-9ss87\" (UID: \"1213cb1f-843a-4e6c-b56c-cf39c8108614\") " pod="openstack/dnsmasq-dns-6c654c9745-9ss87" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.053328 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-ovsdbserver-nb\") pod \"dnsmasq-dns-6c654c9745-9ss87\" (UID: \"1213cb1f-843a-4e6c-b56c-cf39c8108614\") " pod="openstack/dnsmasq-dns-6c654c9745-9ss87" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.053513 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-dns-svc\") pod \"dnsmasq-dns-6c654c9745-9ss87\" (UID: \"1213cb1f-843a-4e6c-b56c-cf39c8108614\") " pod="openstack/dnsmasq-dns-6c654c9745-9ss87" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.053916 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-ovsdbserver-sb\") pod \"dnsmasq-dns-6c654c9745-9ss87\" (UID: \"1213cb1f-843a-4e6c-b56c-cf39c8108614\") " pod="openstack/dnsmasq-dns-6c654c9745-9ss87" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.066199 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0edbf62a-a3fc-416a-a94d-395da81b7b63" path="/var/lib/kubelet/pods/0edbf62a-a3fc-416a-a94d-395da81b7b63/volumes" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.066687 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38e4b9dc-4234-4602-9111-514d6d94e10b" path="/var/lib/kubelet/pods/38e4b9dc-4234-4602-9111-514d6d94e10b/volumes" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.067035 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46649dc4-4337-4378-a0a1-70b329141a22" path="/var/lib/kubelet/pods/46649dc4-4337-4378-a0a1-70b329141a22/volumes" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.067185 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nctdc\" (UniqueName: \"kubernetes.io/projected/1213cb1f-843a-4e6c-b56c-cf39c8108614-kube-api-access-nctdc\") pod 
\"dnsmasq-dns-6c654c9745-9ss87\" (UID: \"1213cb1f-843a-4e6c-b56c-cf39c8108614\") " pod="openstack/dnsmasq-dns-6c654c9745-9ss87" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.067777 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6ffbb5d-hrkmt"] Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.069792 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6ffbb5d-hrkmt" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.071696 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6ffbb5d-hrkmt"] Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.073375 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.073675 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-56mjh" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.073816 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.073946 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.151596 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80a0c11a-54b3-4823-86c1-8b48b37b46e4-combined-ca-bundle\") pod \"neutron-6ffbb5d-hrkmt\" (UID: \"80a0c11a-54b3-4823-86c1-8b48b37b46e4\") " pod="openstack/neutron-6ffbb5d-hrkmt" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.151693 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/80a0c11a-54b3-4823-86c1-8b48b37b46e4-httpd-config\") pod \"neutron-6ffbb5d-hrkmt\" (UID: 
\"80a0c11a-54b3-4823-86c1-8b48b37b46e4\") " pod="openstack/neutron-6ffbb5d-hrkmt" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.151777 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2pkr\" (UniqueName: \"kubernetes.io/projected/80a0c11a-54b3-4823-86c1-8b48b37b46e4-kube-api-access-c2pkr\") pod \"neutron-6ffbb5d-hrkmt\" (UID: \"80a0c11a-54b3-4823-86c1-8b48b37b46e4\") " pod="openstack/neutron-6ffbb5d-hrkmt" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.151845 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/80a0c11a-54b3-4823-86c1-8b48b37b46e4-config\") pod \"neutron-6ffbb5d-hrkmt\" (UID: \"80a0c11a-54b3-4823-86c1-8b48b37b46e4\") " pod="openstack/neutron-6ffbb5d-hrkmt" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.152237 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/80a0c11a-54b3-4823-86c1-8b48b37b46e4-ovndb-tls-certs\") pod \"neutron-6ffbb5d-hrkmt\" (UID: \"80a0c11a-54b3-4823-86c1-8b48b37b46e4\") " pod="openstack/neutron-6ffbb5d-hrkmt" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.214208 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c654c9745-9ss87" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.255165 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/80a0c11a-54b3-4823-86c1-8b48b37b46e4-httpd-config\") pod \"neutron-6ffbb5d-hrkmt\" (UID: \"80a0c11a-54b3-4823-86c1-8b48b37b46e4\") " pod="openstack/neutron-6ffbb5d-hrkmt" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.255214 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2pkr\" (UniqueName: \"kubernetes.io/projected/80a0c11a-54b3-4823-86c1-8b48b37b46e4-kube-api-access-c2pkr\") pod \"neutron-6ffbb5d-hrkmt\" (UID: \"80a0c11a-54b3-4823-86c1-8b48b37b46e4\") " pod="openstack/neutron-6ffbb5d-hrkmt" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.255270 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/80a0c11a-54b3-4823-86c1-8b48b37b46e4-config\") pod \"neutron-6ffbb5d-hrkmt\" (UID: \"80a0c11a-54b3-4823-86c1-8b48b37b46e4\") " pod="openstack/neutron-6ffbb5d-hrkmt" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.255572 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/80a0c11a-54b3-4823-86c1-8b48b37b46e4-ovndb-tls-certs\") pod \"neutron-6ffbb5d-hrkmt\" (UID: \"80a0c11a-54b3-4823-86c1-8b48b37b46e4\") " pod="openstack/neutron-6ffbb5d-hrkmt" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.255670 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80a0c11a-54b3-4823-86c1-8b48b37b46e4-combined-ca-bundle\") pod \"neutron-6ffbb5d-hrkmt\" (UID: \"80a0c11a-54b3-4823-86c1-8b48b37b46e4\") " pod="openstack/neutron-6ffbb5d-hrkmt" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.259209 
4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80a0c11a-54b3-4823-86c1-8b48b37b46e4-combined-ca-bundle\") pod \"neutron-6ffbb5d-hrkmt\" (UID: \"80a0c11a-54b3-4823-86c1-8b48b37b46e4\") " pod="openstack/neutron-6ffbb5d-hrkmt" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.259888 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/80a0c11a-54b3-4823-86c1-8b48b37b46e4-httpd-config\") pod \"neutron-6ffbb5d-hrkmt\" (UID: \"80a0c11a-54b3-4823-86c1-8b48b37b46e4\") " pod="openstack/neutron-6ffbb5d-hrkmt" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.268183 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/80a0c11a-54b3-4823-86c1-8b48b37b46e4-ovndb-tls-certs\") pod \"neutron-6ffbb5d-hrkmt\" (UID: \"80a0c11a-54b3-4823-86c1-8b48b37b46e4\") " pod="openstack/neutron-6ffbb5d-hrkmt" Nov 24 09:18:49 crc kubenswrapper[4563]: E1124 09:18:49.268304 4563 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:37d64e0a00c54e71a4c1fcbbbf7e832f6886ffd03c9a02b6ee3ca48fabc30879" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.268433 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/80a0c11a-54b3-4823-86c1-8b48b37b46e4-config\") pod \"neutron-6ffbb5d-hrkmt\" (UID: \"80a0c11a-54b3-4823-86c1-8b48b37b46e4\") " pod="openstack/neutron-6ffbb5d-hrkmt" Nov 24 09:18:49 crc kubenswrapper[4563]: E1124 09:18:49.268516 4563 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:37d64e0a00c54e71a4c1fcbbbf7e832f6886ffd03c9a02b6ee3ca48fabc30879,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wnr86,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Ca
pabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-stk25_openstack(b0793e21-229f-415e-8b3e-1499e1ed3bf6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 24 09:18:49 crc kubenswrapper[4563]: E1124 09:18:49.269674 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-stk25" podUID="b0793e21-229f-415e-8b3e-1499e1ed3bf6" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.281009 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2pkr\" (UniqueName: \"kubernetes.io/projected/80a0c11a-54b3-4823-86c1-8b48b37b46e4-kube-api-access-c2pkr\") pod \"neutron-6ffbb5d-hrkmt\" (UID: \"80a0c11a-54b3-4823-86c1-8b48b37b46e4\") " pod="openstack/neutron-6ffbb5d-hrkmt" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.289994 4563 scope.go:117] "RemoveContainer" containerID="4c4adf3fda70bd0403946d2ea1fe2270168e48af824d9046727bde6eaa33062b" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.411305 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6ffbb5d-hrkmt" Nov 24 09:18:49 crc kubenswrapper[4563]: E1124 09:18:49.716183 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:37d64e0a00c54e71a4c1fcbbbf7e832f6886ffd03c9a02b6ee3ca48fabc30879\\\"\"" pod="openstack/cinder-db-sync-stk25" podUID="b0793e21-229f-415e-8b3e-1499e1ed3bf6" Nov 24 09:18:49 crc kubenswrapper[4563]: E1124 09:18:49.728802 4563 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:d375d370be5ead0dac71109af644849e5795f535f9ad8eeacea261d77ae6f140" Nov 24 09:18:49 crc kubenswrapper[4563]: E1124 09:18:49.729022 4563 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:d375d370be5ead0dac71109af644849e5795f535f9ad8eeacea261d77ae6f140,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n9bh74hf6h644h8bh5cdh657hffh5fhf5hffh689h57fhd7h86h5c7h56chcdh64dh95h558hfch54fh685h565h5c6h656h564h547h694h99hffq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-plvfd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(b995b5b3-41b0-4334-9f7c-792a50e780e7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.807040 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-86c668996f-cnx27" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.842364 4563 scope.go:117] "RemoveContainer" containerID="e7636c149a4b5bd1098aa169a69ea8d47f767d6a95e9db54cf48fdd3ebc2b10b" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.928733 4563 scope.go:117] "RemoveContainer" containerID="e3ecef361d60e4158ee904d8c75861ca04dd27652ec80cabafd5896cf0859914" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.973587 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e6fe7506-2be4-49fb-9295-bf5960b88baf-horizon-secret-key\") pod \"e6fe7506-2be4-49fb-9295-bf5960b88baf\" (UID: \"e6fe7506-2be4-49fb-9295-bf5960b88baf\") " Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.973778 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6fe7506-2be4-49fb-9295-bf5960b88baf-scripts\") pod \"e6fe7506-2be4-49fb-9295-bf5960b88baf\" (UID: \"e6fe7506-2be4-49fb-9295-bf5960b88baf\") " Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.973836 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6fe7506-2be4-49fb-9295-bf5960b88baf-logs\") pod \"e6fe7506-2be4-49fb-9295-bf5960b88baf\" (UID: \"e6fe7506-2be4-49fb-9295-bf5960b88baf\") " Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.973884 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6fe7506-2be4-49fb-9295-bf5960b88baf-config-data\") pod \"e6fe7506-2be4-49fb-9295-bf5960b88baf\" (UID: \"e6fe7506-2be4-49fb-9295-bf5960b88baf\") " Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.973932 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68lxl\" (UniqueName: 
\"kubernetes.io/projected/e6fe7506-2be4-49fb-9295-bf5960b88baf-kube-api-access-68lxl\") pod \"e6fe7506-2be4-49fb-9295-bf5960b88baf\" (UID: \"e6fe7506-2be4-49fb-9295-bf5960b88baf\") " Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.974778 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6fe7506-2be4-49fb-9295-bf5960b88baf-logs" (OuterVolumeSpecName: "logs") pod "e6fe7506-2be4-49fb-9295-bf5960b88baf" (UID: "e6fe7506-2be4-49fb-9295-bf5960b88baf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.975511 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6fe7506-2be4-49fb-9295-bf5960b88baf-scripts" (OuterVolumeSpecName: "scripts") pod "e6fe7506-2be4-49fb-9295-bf5960b88baf" (UID: "e6fe7506-2be4-49fb-9295-bf5960b88baf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.975736 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6fe7506-2be4-49fb-9295-bf5960b88baf-config-data" (OuterVolumeSpecName: "config-data") pod "e6fe7506-2be4-49fb-9295-bf5960b88baf" (UID: "e6fe7506-2be4-49fb-9295-bf5960b88baf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.983064 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6fe7506-2be4-49fb-9295-bf5960b88baf-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e6fe7506-2be4-49fb-9295-bf5960b88baf" (UID: "e6fe7506-2be4-49fb-9295-bf5960b88baf"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:49 crc kubenswrapper[4563]: I1124 09:18:49.983592 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6fe7506-2be4-49fb-9295-bf5960b88baf-kube-api-access-68lxl" (OuterVolumeSpecName: "kube-api-access-68lxl") pod "e6fe7506-2be4-49fb-9295-bf5960b88baf" (UID: "e6fe7506-2be4-49fb-9295-bf5960b88baf"). InnerVolumeSpecName "kube-api-access-68lxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.078078 4563 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6fe7506-2be4-49fb-9295-bf5960b88baf-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.078115 4563 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6fe7506-2be4-49fb-9295-bf5960b88baf-logs\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.078164 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6fe7506-2be4-49fb-9295-bf5960b88baf-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.078423 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68lxl\" (UniqueName: \"kubernetes.io/projected/e6fe7506-2be4-49fb-9295-bf5960b88baf-kube-api-access-68lxl\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.078435 4563 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e6fe7506-2be4-49fb-9295-bf5960b88baf-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.332574 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7cd5c59c66-hrmf5"] Nov 
24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.344205 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jh96b"] Nov 24 09:18:50 crc kubenswrapper[4563]: W1124 09:18:50.347310 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ec5b651_57ef_414b_8c8e_4b488d71663f.slice/crio-3fabc59ee478fa8ffa39b7f3bb0c8faa2e304247a0963af93e364c189c2d2cb3 WatchSource:0}: Error finding container 3fabc59ee478fa8ffa39b7f3bb0c8faa2e304247a0963af93e364c189c2d2cb3: Status 404 returned error can't find the container with id 3fabc59ee478fa8ffa39b7f3bb0c8faa2e304247a0963af93e364c189c2d2cb3 Nov 24 09:18:50 crc kubenswrapper[4563]: W1124 09:18:50.353960 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod539b9102_6a58_4804_8b35_4b183ef45c82.slice/crio-faf13a4c9c1daa8796fed819ca4da0b09e691282f4ceb8021195f193fa768dbb WatchSource:0}: Error finding container faf13a4c9c1daa8796fed819ca4da0b09e691282f4ceb8021195f193fa768dbb: Status 404 returned error can't find the container with id faf13a4c9c1daa8796fed819ca4da0b09e691282f4ceb8021195f193fa768dbb Nov 24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.356276 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d999bbd6-cqj6s"] Nov 24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.362127 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.427755 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.510908 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r9lpl"] Nov 24 09:18:50 crc kubenswrapper[4563]: W1124 09:18:50.519822 4563 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode26141fd_4cfa_4726_ba65_1f3bb830411b.slice/crio-747ebf866b20ae7ae17bf99a74acb6fdf6e69f7d20747967cc7d8599df780475 WatchSource:0}: Error finding container 747ebf866b20ae7ae17bf99a74acb6fdf6e69f7d20747967cc7d8599df780475: Status 404 returned error can't find the container with id 747ebf866b20ae7ae17bf99a74acb6fdf6e69f7d20747967cc7d8599df780475 Nov 24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.569601 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c654c9745-9ss87"] Nov 24 09:18:50 crc kubenswrapper[4563]: W1124 09:18:50.597986 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1213cb1f_843a_4e6c_b56c_cf39c8108614.slice/crio-88c4743ed51784676e5184b5c9648b4b28318da5ed0b067bd21042c1ef70fa1a WatchSource:0}: Error finding container 88c4743ed51784676e5184b5c9648b4b28318da5ed0b067bd21042c1ef70fa1a: Status 404 returned error can't find the container with id 88c4743ed51784676e5184b5c9648b4b28318da5ed0b067bd21042c1ef70fa1a Nov 24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.642985 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6ffbb5d-hrkmt"] Nov 24 09:18:50 crc kubenswrapper[4563]: W1124 09:18:50.652818 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80a0c11a_54b3_4823_86c1_8b48b37b46e4.slice/crio-1b76a76243dd63ee93f1a4747485921334c914e6cef40e436da94ec9143bc4c2 WatchSource:0}: Error finding container 1b76a76243dd63ee93f1a4747485921334c914e6cef40e436da94ec9143bc4c2: Status 404 returned error can't find the container with id 1b76a76243dd63ee93f1a4747485921334c914e6cef40e436da94ec9143bc4c2 Nov 24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.726977 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jh96b" 
event={"ID":"539b9102-6a58-4804-8b35-4b183ef45c82","Type":"ContainerStarted","Data":"ee740e0fa40e50447767b410027ff504c58fc3846d856cd5523dc356da7c8e76"} Nov 24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.727027 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jh96b" event={"ID":"539b9102-6a58-4804-8b35-4b183ef45c82","Type":"ContainerStarted","Data":"faf13a4c9c1daa8796fed819ca4da0b09e691282f4ceb8021195f193fa768dbb"} Nov 24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.730617 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6cead1d3-35c0-4274-be9c-d75115aedd8a","Type":"ContainerStarted","Data":"bf5eb1e1a4293cfaf8c4c4b49433b2476d085f2ece188a65e8af01a1e5c11ef1"} Nov 24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.733875 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cd5c59c66-hrmf5" event={"ID":"0ec5b651-57ef-414b-8c8e-4b488d71663f","Type":"ContainerStarted","Data":"3fabc59ee478fa8ffa39b7f3bb0c8faa2e304247a0963af93e364c189c2d2cb3"} Nov 24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.741996 4563 generic.go:334] "Generic (PLEG): container finished" podID="e26141fd-4cfa-4726-ba65-1f3bb830411b" containerID="7d851339d6225f1a0059e0abcec8c3925af72b82f67984b36ecbd263293e4f94" exitCode=0 Nov 24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.742046 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r9lpl" event={"ID":"e26141fd-4cfa-4726-ba65-1f3bb830411b","Type":"ContainerDied","Data":"7d851339d6225f1a0059e0abcec8c3925af72b82f67984b36ecbd263293e4f94"} Nov 24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.742064 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r9lpl" event={"ID":"e26141fd-4cfa-4726-ba65-1f3bb830411b","Type":"ContainerStarted","Data":"747ebf866b20ae7ae17bf99a74acb6fdf6e69f7d20747967cc7d8599df780475"} Nov 24 
09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.743562 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-jh96b" podStartSLOduration=18.743540358 podStartE2EDuration="18.743540358s" podCreationTimestamp="2025-11-24 09:18:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:18:50.739827705 +0000 UTC m=+907.998805143" watchObservedRunningTime="2025-11-24 09:18:50.743540358 +0000 UTC m=+908.002517795" Nov 24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.748326 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d999bbd6-cqj6s" event={"ID":"a7688cb4-70ea-43e4-85f2-6b96f972538f","Type":"ContainerStarted","Data":"8b300f4a55b5d2273437b58c8ecc202851d2752e9499a63b6c729ecd7ec712e7"} Nov 24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.750397 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"126732ea-0f6a-4fd6-9b5b-959e4da904fe","Type":"ContainerStarted","Data":"95f1baf97eab6396d024ca842b6f3774fd17f77904ecf3cc6df056faf116c8d3"} Nov 24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.762376 4563 generic.go:334] "Generic (PLEG): container finished" podID="571d80f9-80d0-4dae-bfa0-126c8055f9b0" containerID="6c4a123f90c7457838fb2c4f3802362d72bc951ffa13b466cdf2216ed6484ad1" exitCode=0 Nov 24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.762426 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rppmx" event={"ID":"571d80f9-80d0-4dae-bfa0-126c8055f9b0","Type":"ContainerDied","Data":"6c4a123f90c7457838fb2c4f3802362d72bc951ffa13b466cdf2216ed6484ad1"} Nov 24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.764137 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86c668996f-cnx27" 
event={"ID":"e6fe7506-2be4-49fb-9295-bf5960b88baf","Type":"ContainerDied","Data":"f819c1b90505fc6be56127383b06724f8fe7403b0cfad7e693e4595fe2b40ffe"} Nov 24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.764221 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86c668996f-cnx27" Nov 24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.766928 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ffbb5d-hrkmt" event={"ID":"80a0c11a-54b3-4823-86c1-8b48b37b46e4","Type":"ContainerStarted","Data":"1b76a76243dd63ee93f1a4747485921334c914e6cef40e436da94ec9143bc4c2"} Nov 24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.767716 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c654c9745-9ss87" event={"ID":"1213cb1f-843a-4e6c-b56c-cf39c8108614","Type":"ContainerStarted","Data":"88c4743ed51784676e5184b5c9648b4b28318da5ed0b067bd21042c1ef70fa1a"} Nov 24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.837760 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-86c668996f-cnx27"] Nov 24 09:18:50 crc kubenswrapper[4563]: I1124 09:18:50.847515 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-86c668996f-cnx27"] Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.065137 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6fe7506-2be4-49fb-9295-bf5960b88baf" path="/var/lib/kubelet/pods/e6fe7506-2be4-49fb-9295-bf5960b88baf/volumes" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.297694 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-85d84cd957-f2sp9"] Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.300958 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-85d84cd957-f2sp9" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.303210 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.306172 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.311038 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5b560e-1f0c-4469-8455-1aec5e7653bd-public-tls-certs\") pod \"neutron-85d84cd957-f2sp9\" (UID: \"5c5b560e-1f0c-4469-8455-1aec5e7653bd\") " pod="openstack/neutron-85d84cd957-f2sp9" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.311109 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5c5b560e-1f0c-4469-8455-1aec5e7653bd-httpd-config\") pod \"neutron-85d84cd957-f2sp9\" (UID: \"5c5b560e-1f0c-4469-8455-1aec5e7653bd\") " pod="openstack/neutron-85d84cd957-f2sp9" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.311138 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5b560e-1f0c-4469-8455-1aec5e7653bd-internal-tls-certs\") pod \"neutron-85d84cd957-f2sp9\" (UID: \"5c5b560e-1f0c-4469-8455-1aec5e7653bd\") " pod="openstack/neutron-85d84cd957-f2sp9" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.311215 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5b560e-1f0c-4469-8455-1aec5e7653bd-ovndb-tls-certs\") pod \"neutron-85d84cd957-f2sp9\" (UID: \"5c5b560e-1f0c-4469-8455-1aec5e7653bd\") " 
pod="openstack/neutron-85d84cd957-f2sp9" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.311298 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5b560e-1f0c-4469-8455-1aec5e7653bd-combined-ca-bundle\") pod \"neutron-85d84cd957-f2sp9\" (UID: \"5c5b560e-1f0c-4469-8455-1aec5e7653bd\") " pod="openstack/neutron-85d84cd957-f2sp9" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.311426 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdvnl\" (UniqueName: \"kubernetes.io/projected/5c5b560e-1f0c-4469-8455-1aec5e7653bd-kube-api-access-tdvnl\") pod \"neutron-85d84cd957-f2sp9\" (UID: \"5c5b560e-1f0c-4469-8455-1aec5e7653bd\") " pod="openstack/neutron-85d84cd957-f2sp9" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.311456 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c5b560e-1f0c-4469-8455-1aec5e7653bd-config\") pod \"neutron-85d84cd957-f2sp9\" (UID: \"5c5b560e-1f0c-4469-8455-1aec5e7653bd\") " pod="openstack/neutron-85d84cd957-f2sp9" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.319739 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-85d84cd957-f2sp9"] Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.412666 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5b560e-1f0c-4469-8455-1aec5e7653bd-public-tls-certs\") pod \"neutron-85d84cd957-f2sp9\" (UID: \"5c5b560e-1f0c-4469-8455-1aec5e7653bd\") " pod="openstack/neutron-85d84cd957-f2sp9" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.412946 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/5c5b560e-1f0c-4469-8455-1aec5e7653bd-httpd-config\") pod \"neutron-85d84cd957-f2sp9\" (UID: \"5c5b560e-1f0c-4469-8455-1aec5e7653bd\") " pod="openstack/neutron-85d84cd957-f2sp9" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.412968 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5b560e-1f0c-4469-8455-1aec5e7653bd-internal-tls-certs\") pod \"neutron-85d84cd957-f2sp9\" (UID: \"5c5b560e-1f0c-4469-8455-1aec5e7653bd\") " pod="openstack/neutron-85d84cd957-f2sp9" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.413005 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5b560e-1f0c-4469-8455-1aec5e7653bd-ovndb-tls-certs\") pod \"neutron-85d84cd957-f2sp9\" (UID: \"5c5b560e-1f0c-4469-8455-1aec5e7653bd\") " pod="openstack/neutron-85d84cd957-f2sp9" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.413038 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5b560e-1f0c-4469-8455-1aec5e7653bd-combined-ca-bundle\") pod \"neutron-85d84cd957-f2sp9\" (UID: \"5c5b560e-1f0c-4469-8455-1aec5e7653bd\") " pod="openstack/neutron-85d84cd957-f2sp9" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.413087 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdvnl\" (UniqueName: \"kubernetes.io/projected/5c5b560e-1f0c-4469-8455-1aec5e7653bd-kube-api-access-tdvnl\") pod \"neutron-85d84cd957-f2sp9\" (UID: \"5c5b560e-1f0c-4469-8455-1aec5e7653bd\") " pod="openstack/neutron-85d84cd957-f2sp9" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.413111 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c5b560e-1f0c-4469-8455-1aec5e7653bd-config\") pod 
\"neutron-85d84cd957-f2sp9\" (UID: \"5c5b560e-1f0c-4469-8455-1aec5e7653bd\") " pod="openstack/neutron-85d84cd957-f2sp9" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.417381 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5b560e-1f0c-4469-8455-1aec5e7653bd-public-tls-certs\") pod \"neutron-85d84cd957-f2sp9\" (UID: \"5c5b560e-1f0c-4469-8455-1aec5e7653bd\") " pod="openstack/neutron-85d84cd957-f2sp9" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.418483 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5c5b560e-1f0c-4469-8455-1aec5e7653bd-httpd-config\") pod \"neutron-85d84cd957-f2sp9\" (UID: \"5c5b560e-1f0c-4469-8455-1aec5e7653bd\") " pod="openstack/neutron-85d84cd957-f2sp9" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.420144 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c5b560e-1f0c-4469-8455-1aec5e7653bd-config\") pod \"neutron-85d84cd957-f2sp9\" (UID: \"5c5b560e-1f0c-4469-8455-1aec5e7653bd\") " pod="openstack/neutron-85d84cd957-f2sp9" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.420531 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5b560e-1f0c-4469-8455-1aec5e7653bd-ovndb-tls-certs\") pod \"neutron-85d84cd957-f2sp9\" (UID: \"5c5b560e-1f0c-4469-8455-1aec5e7653bd\") " pod="openstack/neutron-85d84cd957-f2sp9" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.421014 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5b560e-1f0c-4469-8455-1aec5e7653bd-combined-ca-bundle\") pod \"neutron-85d84cd957-f2sp9\" (UID: \"5c5b560e-1f0c-4469-8455-1aec5e7653bd\") " pod="openstack/neutron-85d84cd957-f2sp9" Nov 24 09:18:51 crc 
kubenswrapper[4563]: I1124 09:18:51.426090 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5b560e-1f0c-4469-8455-1aec5e7653bd-internal-tls-certs\") pod \"neutron-85d84cd957-f2sp9\" (UID: \"5c5b560e-1f0c-4469-8455-1aec5e7653bd\") " pod="openstack/neutron-85d84cd957-f2sp9" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.427295 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdvnl\" (UniqueName: \"kubernetes.io/projected/5c5b560e-1f0c-4469-8455-1aec5e7653bd-kube-api-access-tdvnl\") pod \"neutron-85d84cd957-f2sp9\" (UID: \"5c5b560e-1f0c-4469-8455-1aec5e7653bd\") " pod="openstack/neutron-85d84cd957-f2sp9" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.632285 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-85d84cd957-f2sp9" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.779821 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d999bbd6-cqj6s" event={"ID":"a7688cb4-70ea-43e4-85f2-6b96f972538f","Type":"ContainerStarted","Data":"9b3b2cf768fabb5f23a04625abe02e994b12ca1362fe363db84b3f26322efc65"} Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.780042 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d999bbd6-cqj6s" event={"ID":"a7688cb4-70ea-43e4-85f2-6b96f972538f","Type":"ContainerStarted","Data":"2510326b8d13bae3285d98de5fce093409d77bbd3ed1699263608ba8446660a3"} Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.789013 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6cead1d3-35c0-4274-be9c-d75115aedd8a","Type":"ContainerStarted","Data":"4421a96d4b49d4cc22471fa9120294500a65bee9a780313ebe9ef9e65c32a26b"} Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.789188 4563 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-external-api-0" podUID="6cead1d3-35c0-4274-be9c-d75115aedd8a" containerName="glance-log" containerID="cri-o://bf5eb1e1a4293cfaf8c4c4b49433b2476d085f2ece188a65e8af01a1e5c11ef1" gracePeriod=30 Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.789302 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6cead1d3-35c0-4274-be9c-d75115aedd8a" containerName="glance-httpd" containerID="cri-o://4421a96d4b49d4cc22471fa9120294500a65bee9a780313ebe9ef9e65c32a26b" gracePeriod=30 Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.797895 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cd5c59c66-hrmf5" event={"ID":"0ec5b651-57ef-414b-8c8e-4b488d71663f","Type":"ContainerStarted","Data":"580a67c88349862289ccfdb112589708d48749b86e526a80925a6ba1dc67dab0"} Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.797950 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cd5c59c66-hrmf5" event={"ID":"0ec5b651-57ef-414b-8c8e-4b488d71663f","Type":"ContainerStarted","Data":"babd8f40dad9a932d16d97dcc9b930bba3d8e659d2c29350a0aba7329d1d2209"} Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.806047 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6d999bbd6-cqj6s" podStartSLOduration=17.111084023 podStartE2EDuration="17.806038369s" podCreationTimestamp="2025-11-24 09:18:34 +0000 UTC" firstStartedPulling="2025-11-24 09:18:50.369801789 +0000 UTC m=+907.628779236" lastFinishedPulling="2025-11-24 09:18:51.064756135 +0000 UTC m=+908.323733582" observedRunningTime="2025-11-24 09:18:51.803493439 +0000 UTC m=+909.062470885" watchObservedRunningTime="2025-11-24 09:18:51.806038369 +0000 UTC m=+909.065015816" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.824753 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b995b5b3-41b0-4334-9f7c-792a50e780e7","Type":"ContainerStarted","Data":"714c30487c4e7dfc7a7cf3f6168922dc32da768f7e7dbe76847800c55047ae48"} Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.845620 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"126732ea-0f6a-4fd6-9b5b-959e4da904fe","Type":"ContainerStarted","Data":"123de7ce0128b04a34dc9c98f1bd7fba7f887c0a9e1265cd172ebac4b29c79f6"} Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.849798 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"126732ea-0f6a-4fd6-9b5b-959e4da904fe","Type":"ContainerStarted","Data":"1e2e3f0bbd509f6376d01716104cf491690003d78ec1df0ea20d767237402289"} Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.855153 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ffbb5d-hrkmt" event={"ID":"80a0c11a-54b3-4823-86c1-8b48b37b46e4","Type":"ContainerStarted","Data":"f49b10e8d89be654f65a1d0c8844563f6bd522c3bef7627f789f59fafff60484"} Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.855283 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ffbb5d-hrkmt" event={"ID":"80a0c11a-54b3-4823-86c1-8b48b37b46e4","Type":"ContainerStarted","Data":"6d3a5c2335736b9cafe2fc177ead532e75faac165ee72181e066a3aeeb0e1aff"} Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.855572 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6ffbb5d-hrkmt" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.876092 4563 generic.go:334] "Generic (PLEG): container finished" podID="1213cb1f-843a-4e6c-b56c-cf39c8108614" containerID="2d3d1a70d147bfdd5f33d5722dc44f45ebce5578c7d57d9e346a08f1e4e200bd" exitCode=0 Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.877933 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c654c9745-9ss87" 
event={"ID":"1213cb1f-843a-4e6c-b56c-cf39c8108614","Type":"ContainerDied","Data":"2d3d1a70d147bfdd5f33d5722dc44f45ebce5578c7d57d9e346a08f1e4e200bd"} Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.883618 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=19.883590832 podStartE2EDuration="19.883590832s" podCreationTimestamp="2025-11-24 09:18:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:18:51.829029275 +0000 UTC m=+909.088006722" watchObservedRunningTime="2025-11-24 09:18:51.883590832 +0000 UTC m=+909.142568280" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.907441 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7cd5c59c66-hrmf5" podStartSLOduration=17.173741681 podStartE2EDuration="17.907427325s" podCreationTimestamp="2025-11-24 09:18:34 +0000 UTC" firstStartedPulling="2025-11-24 09:18:50.354044477 +0000 UTC m=+907.613021923" lastFinishedPulling="2025-11-24 09:18:51.087730121 +0000 UTC m=+908.346707567" observedRunningTime="2025-11-24 09:18:51.84640485 +0000 UTC m=+909.105382297" watchObservedRunningTime="2025-11-24 09:18:51.907427325 +0000 UTC m=+909.166404772" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.932681 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=16.932668286 podStartE2EDuration="16.932668286s" podCreationTimestamp="2025-11-24 09:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:18:51.876334896 +0000 UTC m=+909.135312343" watchObservedRunningTime="2025-11-24 09:18:51.932668286 +0000 UTC m=+909.191645733" Nov 24 09:18:51 crc kubenswrapper[4563]: I1124 09:18:51.938205 4563 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6ffbb5d-hrkmt" podStartSLOduration=2.93819449 podStartE2EDuration="2.93819449s" podCreationTimestamp="2025-11-24 09:18:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:18:51.891222938 +0000 UTC m=+909.150200386" watchObservedRunningTime="2025-11-24 09:18:51.93819449 +0000 UTC m=+909.197171937" Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.206182 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-85d84cd957-f2sp9"] Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.549001 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.680540 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"6cead1d3-35c0-4274-be9c-d75115aedd8a\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.680587 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cead1d3-35c0-4274-be9c-d75115aedd8a-scripts\") pod \"6cead1d3-35c0-4274-be9c-d75115aedd8a\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.680666 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cead1d3-35c0-4274-be9c-d75115aedd8a-public-tls-certs\") pod \"6cead1d3-35c0-4274-be9c-d75115aedd8a\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.680730 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/6cead1d3-35c0-4274-be9c-d75115aedd8a-config-data\") pod \"6cead1d3-35c0-4274-be9c-d75115aedd8a\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.680755 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cead1d3-35c0-4274-be9c-d75115aedd8a-logs\") pod \"6cead1d3-35c0-4274-be9c-d75115aedd8a\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.680811 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cead1d3-35c0-4274-be9c-d75115aedd8a-combined-ca-bundle\") pod \"6cead1d3-35c0-4274-be9c-d75115aedd8a\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.680848 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5tcc\" (UniqueName: \"kubernetes.io/projected/6cead1d3-35c0-4274-be9c-d75115aedd8a-kube-api-access-d5tcc\") pod \"6cead1d3-35c0-4274-be9c-d75115aedd8a\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.680892 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6cead1d3-35c0-4274-be9c-d75115aedd8a-httpd-run\") pod \"6cead1d3-35c0-4274-be9c-d75115aedd8a\" (UID: \"6cead1d3-35c0-4274-be9c-d75115aedd8a\") " Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.681651 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cead1d3-35c0-4274-be9c-d75115aedd8a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6cead1d3-35c0-4274-be9c-d75115aedd8a" (UID: "6cead1d3-35c0-4274-be9c-d75115aedd8a"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.681891 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cead1d3-35c0-4274-be9c-d75115aedd8a-logs" (OuterVolumeSpecName: "logs") pod "6cead1d3-35c0-4274-be9c-d75115aedd8a" (UID: "6cead1d3-35c0-4274-be9c-d75115aedd8a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.684826 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "6cead1d3-35c0-4274-be9c-d75115aedd8a" (UID: "6cead1d3-35c0-4274-be9c-d75115aedd8a"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.685201 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cead1d3-35c0-4274-be9c-d75115aedd8a-scripts" (OuterVolumeSpecName: "scripts") pod "6cead1d3-35c0-4274-be9c-d75115aedd8a" (UID: "6cead1d3-35c0-4274-be9c-d75115aedd8a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.690760 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cead1d3-35c0-4274-be9c-d75115aedd8a-kube-api-access-d5tcc" (OuterVolumeSpecName: "kube-api-access-d5tcc") pod "6cead1d3-35c0-4274-be9c-d75115aedd8a" (UID: "6cead1d3-35c0-4274-be9c-d75115aedd8a"). InnerVolumeSpecName "kube-api-access-d5tcc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.738833 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cead1d3-35c0-4274-be9c-d75115aedd8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6cead1d3-35c0-4274-be9c-d75115aedd8a" (UID: "6cead1d3-35c0-4274-be9c-d75115aedd8a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.772063 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cead1d3-35c0-4274-be9c-d75115aedd8a-config-data" (OuterVolumeSpecName: "config-data") pod "6cead1d3-35c0-4274-be9c-d75115aedd8a" (UID: "6cead1d3-35c0-4274-be9c-d75115aedd8a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.786267 4563 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cead1d3-35c0-4274-be9c-d75115aedd8a-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.786298 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cead1d3-35c0-4274-be9c-d75115aedd8a-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.786313 4563 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cead1d3-35c0-4274-be9c-d75115aedd8a-logs\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.786323 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cead1d3-35c0-4274-be9c-d75115aedd8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:52 crc 
kubenswrapper[4563]: I1124 09:18:52.786334 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5tcc\" (UniqueName: \"kubernetes.io/projected/6cead1d3-35c0-4274-be9c-d75115aedd8a-kube-api-access-d5tcc\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.786343 4563 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6cead1d3-35c0-4274-be9c-d75115aedd8a-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.786374 4563 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.802687 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cead1d3-35c0-4274-be9c-d75115aedd8a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6cead1d3-35c0-4274-be9c-d75115aedd8a" (UID: "6cead1d3-35c0-4274-be9c-d75115aedd8a"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.807516 4563 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.888499 4563 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.888550 4563 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cead1d3-35c0-4274-be9c-d75115aedd8a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.924880 4563 generic.go:334] "Generic (PLEG): container finished" podID="6cead1d3-35c0-4274-be9c-d75115aedd8a" containerID="4421a96d4b49d4cc22471fa9120294500a65bee9a780313ebe9ef9e65c32a26b" exitCode=0 Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.924940 4563 generic.go:334] "Generic (PLEG): container finished" podID="6cead1d3-35c0-4274-be9c-d75115aedd8a" containerID="bf5eb1e1a4293cfaf8c4c4b49433b2476d085f2ece188a65e8af01a1e5c11ef1" exitCode=143 Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.925029 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6cead1d3-35c0-4274-be9c-d75115aedd8a","Type":"ContainerDied","Data":"4421a96d4b49d4cc22471fa9120294500a65bee9a780313ebe9ef9e65c32a26b"} Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.925068 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6cead1d3-35c0-4274-be9c-d75115aedd8a","Type":"ContainerDied","Data":"bf5eb1e1a4293cfaf8c4c4b49433b2476d085f2ece188a65e8af01a1e5c11ef1"} Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.925085 
4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6cead1d3-35c0-4274-be9c-d75115aedd8a","Type":"ContainerDied","Data":"144eee9f484b95119b714357ff67b9c8e69e0585b73380ad334091e47571516d"} Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.925106 4563 scope.go:117] "RemoveContainer" containerID="4421a96d4b49d4cc22471fa9120294500a65bee9a780313ebe9ef9e65c32a26b" Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.925322 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.941501 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85d84cd957-f2sp9" event={"ID":"5c5b560e-1f0c-4469-8455-1aec5e7653bd","Type":"ContainerStarted","Data":"1ed9635c292ad3d4d0c84f412c7078b8627393a683beb28c58c96ae72122d349"} Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.941558 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85d84cd957-f2sp9" event={"ID":"5c5b560e-1f0c-4469-8455-1aec5e7653bd","Type":"ContainerStarted","Data":"b722c4b31ad9358bd2c8a06467429abfebb0771ac274376f51bcd2fc2610ba7f"} Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.941569 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85d84cd957-f2sp9" event={"ID":"5c5b560e-1f0c-4469-8455-1aec5e7653bd","Type":"ContainerStarted","Data":"cf13ddbce641c71f7b21b3b7cb7b0ebafdefd7f14ad794c84e8c39393a9e7ec2"} Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.942059 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-85d84cd957-f2sp9" Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.955284 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.958927 4563 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-6c654c9745-9ss87" event={"ID":"1213cb1f-843a-4e6c-b56c-cf39c8108614","Type":"ContainerStarted","Data":"0439d1adb4816d7c44e31ae913ed92ed69ad6ef565afbc090fd1aa5fea29fa3f"} Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.958996 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c654c9745-9ss87" Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.959733 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.963355 4563 generic.go:334] "Generic (PLEG): container finished" podID="571d80f9-80d0-4dae-bfa0-126c8055f9b0" containerID="7ef7f9add08a8ee7aa538215f14ed78667ffa8fcd949150630626fd6be2c4a1f" exitCode=0 Nov 24 09:18:52 crc kubenswrapper[4563]: I1124 09:18:52.964469 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rppmx" event={"ID":"571d80f9-80d0-4dae-bfa0-126c8055f9b0","Type":"ContainerDied","Data":"7ef7f9add08a8ee7aa538215f14ed78667ffa8fcd949150630626fd6be2c4a1f"} Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.001743 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-85d84cd957-f2sp9" podStartSLOduration=1.999705348 podStartE2EDuration="1.999705348s" podCreationTimestamp="2025-11-24 09:18:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:18:52.980006751 +0000 UTC m=+910.238984198" watchObservedRunningTime="2025-11-24 09:18:52.999705348 +0000 UTC m=+910.258682795" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.017578 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.030577 4563 scope.go:117] "RemoveContainer" 
containerID="bf5eb1e1a4293cfaf8c4c4b49433b2476d085f2ece188a65e8af01a1e5c11ef1" Nov 24 09:18:53 crc kubenswrapper[4563]: E1124 09:18:53.032084 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cead1d3-35c0-4274-be9c-d75115aedd8a" containerName="glance-httpd" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.032109 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cead1d3-35c0-4274-be9c-d75115aedd8a" containerName="glance-httpd" Nov 24 09:18:53 crc kubenswrapper[4563]: E1124 09:18:53.032128 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cead1d3-35c0-4274-be9c-d75115aedd8a" containerName="glance-log" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.032135 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cead1d3-35c0-4274-be9c-d75115aedd8a" containerName="glance-log" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.032372 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cead1d3-35c0-4274-be9c-d75115aedd8a" containerName="glance-log" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.032399 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cead1d3-35c0-4274-be9c-d75115aedd8a" containerName="glance-httpd" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.033396 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.033503 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.038489 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.038689 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.039160 4563 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod9afed3b6-6377-4699-8ed3-ffbcff7b1c13"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod9afed3b6-6377-4699-8ed3-ffbcff7b1c13] : Timed out while waiting for systemd to remove kubepods-besteffort-pod9afed3b6_6377_4699_8ed3_ffbcff7b1c13.slice" Nov 24 09:18:53 crc kubenswrapper[4563]: E1124 09:18:53.039194 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod9afed3b6-6377-4699-8ed3-ffbcff7b1c13] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod9afed3b6-6377-4699-8ed3-ffbcff7b1c13] : Timed out while waiting for systemd to remove kubepods-besteffort-pod9afed3b6_6377_4699_8ed3_ffbcff7b1c13.slice" pod="openstack/dnsmasq-dns-84ddf475bf-48vsf" podUID="9afed3b6-6377-4699-8ed3-ffbcff7b1c13" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.097193 4563 scope.go:117] "RemoveContainer" containerID="4421a96d4b49d4cc22471fa9120294500a65bee9a780313ebe9ef9e65c32a26b" Nov 24 09:18:53 crc kubenswrapper[4563]: E1124 09:18:53.097894 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4421a96d4b49d4cc22471fa9120294500a65bee9a780313ebe9ef9e65c32a26b\": container with ID starting with 4421a96d4b49d4cc22471fa9120294500a65bee9a780313ebe9ef9e65c32a26b not found: ID does not exist" 
containerID="4421a96d4b49d4cc22471fa9120294500a65bee9a780313ebe9ef9e65c32a26b" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.097952 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4421a96d4b49d4cc22471fa9120294500a65bee9a780313ebe9ef9e65c32a26b"} err="failed to get container status \"4421a96d4b49d4cc22471fa9120294500a65bee9a780313ebe9ef9e65c32a26b\": rpc error: code = NotFound desc = could not find container \"4421a96d4b49d4cc22471fa9120294500a65bee9a780313ebe9ef9e65c32a26b\": container with ID starting with 4421a96d4b49d4cc22471fa9120294500a65bee9a780313ebe9ef9e65c32a26b not found: ID does not exist" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.097984 4563 scope.go:117] "RemoveContainer" containerID="bf5eb1e1a4293cfaf8c4c4b49433b2476d085f2ece188a65e8af01a1e5c11ef1" Nov 24 09:18:53 crc kubenswrapper[4563]: E1124 09:18:53.098564 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf5eb1e1a4293cfaf8c4c4b49433b2476d085f2ece188a65e8af01a1e5c11ef1\": container with ID starting with bf5eb1e1a4293cfaf8c4c4b49433b2476d085f2ece188a65e8af01a1e5c11ef1 not found: ID does not exist" containerID="bf5eb1e1a4293cfaf8c4c4b49433b2476d085f2ece188a65e8af01a1e5c11ef1" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.098683 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf5eb1e1a4293cfaf8c4c4b49433b2476d085f2ece188a65e8af01a1e5c11ef1"} err="failed to get container status \"bf5eb1e1a4293cfaf8c4c4b49433b2476d085f2ece188a65e8af01a1e5c11ef1\": rpc error: code = NotFound desc = could not find container \"bf5eb1e1a4293cfaf8c4c4b49433b2476d085f2ece188a65e8af01a1e5c11ef1\": container with ID starting with bf5eb1e1a4293cfaf8c4c4b49433b2476d085f2ece188a65e8af01a1e5c11ef1 not found: ID does not exist" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.098718 4563 scope.go:117] 
"RemoveContainer" containerID="4421a96d4b49d4cc22471fa9120294500a65bee9a780313ebe9ef9e65c32a26b" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.099095 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4421a96d4b49d4cc22471fa9120294500a65bee9a780313ebe9ef9e65c32a26b"} err="failed to get container status \"4421a96d4b49d4cc22471fa9120294500a65bee9a780313ebe9ef9e65c32a26b\": rpc error: code = NotFound desc = could not find container \"4421a96d4b49d4cc22471fa9120294500a65bee9a780313ebe9ef9e65c32a26b\": container with ID starting with 4421a96d4b49d4cc22471fa9120294500a65bee9a780313ebe9ef9e65c32a26b not found: ID does not exist" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.099128 4563 scope.go:117] "RemoveContainer" containerID="bf5eb1e1a4293cfaf8c4c4b49433b2476d085f2ece188a65e8af01a1e5c11ef1" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.099783 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf5eb1e1a4293cfaf8c4c4b49433b2476d085f2ece188a65e8af01a1e5c11ef1"} err="failed to get container status \"bf5eb1e1a4293cfaf8c4c4b49433b2476d085f2ece188a65e8af01a1e5c11ef1\": rpc error: code = NotFound desc = could not find container \"bf5eb1e1a4293cfaf8c4c4b49433b2476d085f2ece188a65e8af01a1e5c11ef1\": container with ID starting with bf5eb1e1a4293cfaf8c4c4b49433b2476d085f2ece188a65e8af01a1e5c11ef1 not found: ID does not exist" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.124554 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c654c9745-9ss87" podStartSLOduration=5.124529369 podStartE2EDuration="5.124529369s" podCreationTimestamp="2025-11-24 09:18:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:18:53.038135337 +0000 UTC m=+910.297112784" watchObservedRunningTime="2025-11-24 
09:18:53.124529369 +0000 UTC m=+910.383506816" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.126798 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cead1d3-35c0-4274-be9c-d75115aedd8a" path="/var/lib/kubelet/pods/6cead1d3-35c0-4274-be9c-d75115aedd8a/volumes" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.202219 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tjtv\" (UniqueName: \"kubernetes.io/projected/ce3468f6-f565-41d5-ad15-302e10230479-kube-api-access-4tjtv\") pod \"glance-default-external-api-0\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.202268 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.202320 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ce3468f6-f565-41d5-ad15-302e10230479-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.202336 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce3468f6-f565-41d5-ad15-302e10230479-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.202449 4563 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce3468f6-f565-41d5-ad15-302e10230479-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.202588 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce3468f6-f565-41d5-ad15-302e10230479-config-data\") pod \"glance-default-external-api-0\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.202675 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce3468f6-f565-41d5-ad15-302e10230479-scripts\") pod \"glance-default-external-api-0\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.202910 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce3468f6-f565-41d5-ad15-302e10230479-logs\") pod \"glance-default-external-api-0\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.304868 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce3468f6-f565-41d5-ad15-302e10230479-config-data\") pod \"glance-default-external-api-0\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.305209 4563 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce3468f6-f565-41d5-ad15-302e10230479-scripts\") pod \"glance-default-external-api-0\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.305312 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce3468f6-f565-41d5-ad15-302e10230479-logs\") pod \"glance-default-external-api-0\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.305486 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tjtv\" (UniqueName: \"kubernetes.io/projected/ce3468f6-f565-41d5-ad15-302e10230479-kube-api-access-4tjtv\") pod \"glance-default-external-api-0\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.305567 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.305703 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce3468f6-f565-41d5-ad15-302e10230479-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.305785 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ce3468f6-f565-41d5-ad15-302e10230479-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.305918 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce3468f6-f565-41d5-ad15-302e10230479-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.306199 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce3468f6-f565-41d5-ad15-302e10230479-logs\") pod \"glance-default-external-api-0\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.306505 4563 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.306771 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ce3468f6-f565-41d5-ad15-302e10230479-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.312987 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ce3468f6-f565-41d5-ad15-302e10230479-scripts\") pod \"glance-default-external-api-0\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.315002 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce3468f6-f565-41d5-ad15-302e10230479-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.321119 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tjtv\" (UniqueName: \"kubernetes.io/projected/ce3468f6-f565-41d5-ad15-302e10230479-kube-api-access-4tjtv\") pod \"glance-default-external-api-0\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.324945 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce3468f6-f565-41d5-ad15-302e10230479-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.325292 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce3468f6-f565-41d5-ad15-302e10230479-config-data\") pod \"glance-default-external-api-0\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.382787 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod 
\"glance-default-external-api-0\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " pod="openstack/glance-default-external-api-0" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.395789 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.971838 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 09:18:53 crc kubenswrapper[4563]: W1124 09:18:53.982250 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce3468f6_f565_41d5_ad15_302e10230479.slice/crio-4945edf98dff45060f13b65653f7eded9f0af294bf9e293bc7180c54927d3059 WatchSource:0}: Error finding container 4945edf98dff45060f13b65653f7eded9f0af294bf9e293bc7180c54927d3059: Status 404 returned error can't find the container with id 4945edf98dff45060f13b65653f7eded9f0af294bf9e293bc7180c54927d3059 Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.999789 4563 generic.go:334] "Generic (PLEG): container finished" podID="539b9102-6a58-4804-8b35-4b183ef45c82" containerID="ee740e0fa40e50447767b410027ff504c58fc3846d856cd5523dc356da7c8e76" exitCode=0 Nov 24 09:18:53 crc kubenswrapper[4563]: I1124 09:18:53.999875 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jh96b" event={"ID":"539b9102-6a58-4804-8b35-4b183ef45c82","Type":"ContainerDied","Data":"ee740e0fa40e50447767b410027ff504c58fc3846d856cd5523dc356da7c8e76"} Nov 24 09:18:54 crc kubenswrapper[4563]: I1124 09:18:54.022846 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rppmx" event={"ID":"571d80f9-80d0-4dae-bfa0-126c8055f9b0","Type":"ContainerStarted","Data":"b8394dbe6dc07d3f781e10185db336446fa29c2605c94506c32c2fb21e3d0c10"} Nov 24 09:18:54 crc kubenswrapper[4563]: I1124 09:18:54.023308 4563 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84ddf475bf-48vsf" Nov 24 09:18:54 crc kubenswrapper[4563]: I1124 09:18:54.087423 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rppmx" podStartSLOduration=20.536869385 podStartE2EDuration="23.08740186s" podCreationTimestamp="2025-11-24 09:18:31 +0000 UTC" firstStartedPulling="2025-11-24 09:18:51.022916975 +0000 UTC m=+908.281894423" lastFinishedPulling="2025-11-24 09:18:53.57344945 +0000 UTC m=+910.832426898" observedRunningTime="2025-11-24 09:18:54.046982698 +0000 UTC m=+911.305960146" watchObservedRunningTime="2025-11-24 09:18:54.08740186 +0000 UTC m=+911.346379307" Nov 24 09:18:54 crc kubenswrapper[4563]: I1124 09:18:54.115140 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84ddf475bf-48vsf"] Nov 24 09:18:54 crc kubenswrapper[4563]: I1124 09:18:54.120527 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84ddf475bf-48vsf"] Nov 24 09:18:54 crc kubenswrapper[4563]: I1124 09:18:54.648749 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:18:54 crc kubenswrapper[4563]: I1124 09:18:54.649128 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:18:54 crc kubenswrapper[4563]: I1124 09:18:54.847341 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6d999bbd6-cqj6s" Nov 24 09:18:54 crc kubenswrapper[4563]: I1124 09:18:54.847701 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6d999bbd6-cqj6s" Nov 24 09:18:55 crc kubenswrapper[4563]: I1124 09:18:55.081372 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9afed3b6-6377-4699-8ed3-ffbcff7b1c13" 
path="/var/lib/kubelet/pods/9afed3b6-6377-4699-8ed3-ffbcff7b1c13/volumes" Nov 24 09:18:55 crc kubenswrapper[4563]: I1124 09:18:55.081819 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ce3468f6-f565-41d5-ad15-302e10230479","Type":"ContainerStarted","Data":"e4972f237565195a76103efe1025e783d7e0d1366c6f017de9154a12791baa96"} Nov 24 09:18:55 crc kubenswrapper[4563]: I1124 09:18:55.081843 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ce3468f6-f565-41d5-ad15-302e10230479","Type":"ContainerStarted","Data":"4945edf98dff45060f13b65653f7eded9f0af294bf9e293bc7180c54927d3059"} Nov 24 09:18:55 crc kubenswrapper[4563]: I1124 09:18:55.975537 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 24 09:18:55 crc kubenswrapper[4563]: I1124 09:18:55.975812 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 24 09:18:56 crc kubenswrapper[4563]: I1124 09:18:56.003587 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 24 09:18:56 crc kubenswrapper[4563]: I1124 09:18:56.011037 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 24 09:18:56 crc kubenswrapper[4563]: I1124 09:18:56.088691 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ce3468f6-f565-41d5-ad15-302e10230479","Type":"ContainerStarted","Data":"3304aad3aaefb9843dcbd7d53d407c72fbd21eb60a457f9eb5ea04521218050e"} Nov 24 09:18:56 crc kubenswrapper[4563]: I1124 09:18:56.089022 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 24 09:18:56 crc kubenswrapper[4563]: I1124 09:18:56.089043 
4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 24 09:18:56 crc kubenswrapper[4563]: I1124 09:18:56.110813 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.110793746 podStartE2EDuration="4.110793746s" podCreationTimestamp="2025-11-24 09:18:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:18:56.102781304 +0000 UTC m=+913.361758752" watchObservedRunningTime="2025-11-24 09:18:56.110793746 +0000 UTC m=+913.369771194" Nov 24 09:18:57 crc kubenswrapper[4563]: I1124 09:18:57.113616 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jh96b" event={"ID":"539b9102-6a58-4804-8b35-4b183ef45c82","Type":"ContainerDied","Data":"faf13a4c9c1daa8796fed819ca4da0b09e691282f4ceb8021195f193fa768dbb"} Nov 24 09:18:57 crc kubenswrapper[4563]: I1124 09:18:57.113697 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="faf13a4c9c1daa8796fed819ca4da0b09e691282f4ceb8021195f193fa768dbb" Nov 24 09:18:57 crc kubenswrapper[4563]: I1124 09:18:57.143734 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jh96b" Nov 24 09:18:57 crc kubenswrapper[4563]: I1124 09:18:57.213011 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-credential-keys\") pod \"539b9102-6a58-4804-8b35-4b183ef45c82\" (UID: \"539b9102-6a58-4804-8b35-4b183ef45c82\") " Nov 24 09:18:57 crc kubenswrapper[4563]: I1124 09:18:57.213098 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-config-data\") pod \"539b9102-6a58-4804-8b35-4b183ef45c82\" (UID: \"539b9102-6a58-4804-8b35-4b183ef45c82\") " Nov 24 09:18:57 crc kubenswrapper[4563]: I1124 09:18:57.213170 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-scripts\") pod \"539b9102-6a58-4804-8b35-4b183ef45c82\" (UID: \"539b9102-6a58-4804-8b35-4b183ef45c82\") " Nov 24 09:18:57 crc kubenswrapper[4563]: I1124 09:18:57.213338 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-combined-ca-bundle\") pod \"539b9102-6a58-4804-8b35-4b183ef45c82\" (UID: \"539b9102-6a58-4804-8b35-4b183ef45c82\") " Nov 24 09:18:57 crc kubenswrapper[4563]: I1124 09:18:57.213477 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-fernet-keys\") pod \"539b9102-6a58-4804-8b35-4b183ef45c82\" (UID: \"539b9102-6a58-4804-8b35-4b183ef45c82\") " Nov 24 09:18:57 crc kubenswrapper[4563]: I1124 09:18:57.213911 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlv86\" (UniqueName: 
\"kubernetes.io/projected/539b9102-6a58-4804-8b35-4b183ef45c82-kube-api-access-nlv86\") pod \"539b9102-6a58-4804-8b35-4b183ef45c82\" (UID: \"539b9102-6a58-4804-8b35-4b183ef45c82\") " Nov 24 09:18:57 crc kubenswrapper[4563]: I1124 09:18:57.218295 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/539b9102-6a58-4804-8b35-4b183ef45c82-kube-api-access-nlv86" (OuterVolumeSpecName: "kube-api-access-nlv86") pod "539b9102-6a58-4804-8b35-4b183ef45c82" (UID: "539b9102-6a58-4804-8b35-4b183ef45c82"). InnerVolumeSpecName "kube-api-access-nlv86". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:18:57 crc kubenswrapper[4563]: I1124 09:18:57.219263 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-scripts" (OuterVolumeSpecName: "scripts") pod "539b9102-6a58-4804-8b35-4b183ef45c82" (UID: "539b9102-6a58-4804-8b35-4b183ef45c82"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:57 crc kubenswrapper[4563]: I1124 09:18:57.220190 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "539b9102-6a58-4804-8b35-4b183ef45c82" (UID: "539b9102-6a58-4804-8b35-4b183ef45c82"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:57 crc kubenswrapper[4563]: I1124 09:18:57.244130 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "539b9102-6a58-4804-8b35-4b183ef45c82" (UID: "539b9102-6a58-4804-8b35-4b183ef45c82"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:57 crc kubenswrapper[4563]: I1124 09:18:57.244857 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "539b9102-6a58-4804-8b35-4b183ef45c82" (UID: "539b9102-6a58-4804-8b35-4b183ef45c82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:57 crc kubenswrapper[4563]: I1124 09:18:57.246261 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-config-data" (OuterVolumeSpecName: "config-data") pod "539b9102-6a58-4804-8b35-4b183ef45c82" (UID: "539b9102-6a58-4804-8b35-4b183ef45c82"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:18:57 crc kubenswrapper[4563]: I1124 09:18:57.317288 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:57 crc kubenswrapper[4563]: I1124 09:18:57.317319 4563 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:57 crc kubenswrapper[4563]: I1124 09:18:57.317329 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:57 crc kubenswrapper[4563]: I1124 09:18:57.317339 4563 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:57 crc 
kubenswrapper[4563]: I1124 09:18:57.317348 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlv86\" (UniqueName: \"kubernetes.io/projected/539b9102-6a58-4804-8b35-4b183ef45c82-kube-api-access-nlv86\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:57 crc kubenswrapper[4563]: I1124 09:18:57.317368 4563 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/539b9102-6a58-4804-8b35-4b183ef45c82-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 24 09:18:57 crc kubenswrapper[4563]: I1124 09:18:57.739982 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 24 09:18:57 crc kubenswrapper[4563]: I1124 09:18:57.740170 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.122118 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jh96b" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.228932 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6565cb8596-rhwtd"] Nov 24 09:18:58 crc kubenswrapper[4563]: E1124 09:18:58.229237 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="539b9102-6a58-4804-8b35-4b183ef45c82" containerName="keystone-bootstrap" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.229254 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="539b9102-6a58-4804-8b35-4b183ef45c82" containerName="keystone-bootstrap" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.229437 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="539b9102-6a58-4804-8b35-4b183ef45c82" containerName="keystone-bootstrap" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.229960 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6565cb8596-rhwtd" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.237225 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.238999 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.239256 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.239337 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.239392 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.242455 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sw8dk" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.254689 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6565cb8596-rhwtd"] Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.335392 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1-internal-tls-certs\") pod \"keystone-6565cb8596-rhwtd\" (UID: \"d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1\") " pod="openstack/keystone-6565cb8596-rhwtd" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.335479 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj4kh\" (UniqueName: \"kubernetes.io/projected/d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1-kube-api-access-wj4kh\") pod \"keystone-6565cb8596-rhwtd\" (UID: 
\"d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1\") " pod="openstack/keystone-6565cb8596-rhwtd" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.335522 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1-fernet-keys\") pod \"keystone-6565cb8596-rhwtd\" (UID: \"d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1\") " pod="openstack/keystone-6565cb8596-rhwtd" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.335550 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1-public-tls-certs\") pod \"keystone-6565cb8596-rhwtd\" (UID: \"d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1\") " pod="openstack/keystone-6565cb8596-rhwtd" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.335568 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1-combined-ca-bundle\") pod \"keystone-6565cb8596-rhwtd\" (UID: \"d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1\") " pod="openstack/keystone-6565cb8596-rhwtd" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.335590 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1-scripts\") pod \"keystone-6565cb8596-rhwtd\" (UID: \"d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1\") " pod="openstack/keystone-6565cb8596-rhwtd" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.335604 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1-config-data\") pod \"keystone-6565cb8596-rhwtd\" (UID: 
\"d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1\") " pod="openstack/keystone-6565cb8596-rhwtd" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.335628 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1-credential-keys\") pod \"keystone-6565cb8596-rhwtd\" (UID: \"d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1\") " pod="openstack/keystone-6565cb8596-rhwtd" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.436599 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1-fernet-keys\") pod \"keystone-6565cb8596-rhwtd\" (UID: \"d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1\") " pod="openstack/keystone-6565cb8596-rhwtd" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.436659 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1-public-tls-certs\") pod \"keystone-6565cb8596-rhwtd\" (UID: \"d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1\") " pod="openstack/keystone-6565cb8596-rhwtd" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.436686 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1-combined-ca-bundle\") pod \"keystone-6565cb8596-rhwtd\" (UID: \"d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1\") " pod="openstack/keystone-6565cb8596-rhwtd" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.436721 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1-scripts\") pod \"keystone-6565cb8596-rhwtd\" (UID: \"d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1\") " pod="openstack/keystone-6565cb8596-rhwtd" 
Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.436741 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1-config-data\") pod \"keystone-6565cb8596-rhwtd\" (UID: \"d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1\") " pod="openstack/keystone-6565cb8596-rhwtd" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.436768 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1-credential-keys\") pod \"keystone-6565cb8596-rhwtd\" (UID: \"d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1\") " pod="openstack/keystone-6565cb8596-rhwtd" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.436803 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1-internal-tls-certs\") pod \"keystone-6565cb8596-rhwtd\" (UID: \"d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1\") " pod="openstack/keystone-6565cb8596-rhwtd" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.436865 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wj4kh\" (UniqueName: \"kubernetes.io/projected/d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1-kube-api-access-wj4kh\") pod \"keystone-6565cb8596-rhwtd\" (UID: \"d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1\") " pod="openstack/keystone-6565cb8596-rhwtd" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.441905 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1-scripts\") pod \"keystone-6565cb8596-rhwtd\" (UID: \"d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1\") " pod="openstack/keystone-6565cb8596-rhwtd" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.442291 4563 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1-credential-keys\") pod \"keystone-6565cb8596-rhwtd\" (UID: \"d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1\") " pod="openstack/keystone-6565cb8596-rhwtd" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.445325 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1-fernet-keys\") pod \"keystone-6565cb8596-rhwtd\" (UID: \"d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1\") " pod="openstack/keystone-6565cb8596-rhwtd" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.445519 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1-internal-tls-certs\") pod \"keystone-6565cb8596-rhwtd\" (UID: \"d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1\") " pod="openstack/keystone-6565cb8596-rhwtd" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.446165 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1-config-data\") pod \"keystone-6565cb8596-rhwtd\" (UID: \"d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1\") " pod="openstack/keystone-6565cb8596-rhwtd" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.447837 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1-public-tls-certs\") pod \"keystone-6565cb8596-rhwtd\" (UID: \"d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1\") " pod="openstack/keystone-6565cb8596-rhwtd" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.448169 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1-combined-ca-bundle\") pod \"keystone-6565cb8596-rhwtd\" (UID: \"d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1\") " pod="openstack/keystone-6565cb8596-rhwtd" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.455171 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj4kh\" (UniqueName: \"kubernetes.io/projected/d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1-kube-api-access-wj4kh\") pod \"keystone-6565cb8596-rhwtd\" (UID: \"d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1\") " pod="openstack/keystone-6565cb8596-rhwtd" Nov 24 09:18:58 crc kubenswrapper[4563]: I1124 09:18:58.545337 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6565cb8596-rhwtd" Nov 24 09:18:59 crc kubenswrapper[4563]: I1124 09:18:59.216727 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c654c9745-9ss87" Nov 24 09:18:59 crc kubenswrapper[4563]: I1124 09:18:59.272762 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76c58b6d97-qn6g9"] Nov 24 09:18:59 crc kubenswrapper[4563]: I1124 09:18:59.274032 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9" podUID="f2330283-b2b3-4fe5-9924-bc2f48c08497" containerName="dnsmasq-dns" containerID="cri-o://3dcb35494414940ca18de8743344ae135132b027cc72bd50e99f35ca4180e38b" gracePeriod=10 Nov 24 09:19:00 crc kubenswrapper[4563]: I1124 09:19:00.156304 4563 generic.go:334] "Generic (PLEG): container finished" podID="f2330283-b2b3-4fe5-9924-bc2f48c08497" containerID="3dcb35494414940ca18de8743344ae135132b027cc72bd50e99f35ca4180e38b" exitCode=0 Nov 24 09:19:00 crc kubenswrapper[4563]: I1124 09:19:00.156349 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9" 
event={"ID":"f2330283-b2b3-4fe5-9924-bc2f48c08497","Type":"ContainerDied","Data":"3dcb35494414940ca18de8743344ae135132b027cc72bd50e99f35ca4180e38b"} Nov 24 09:19:00 crc kubenswrapper[4563]: I1124 09:19:00.473986 4563 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9" podUID="f2330283-b2b3-4fe5-9924-bc2f48c08497" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.145:5353: connect: connection refused" Nov 24 09:19:02 crc kubenswrapper[4563]: I1124 09:19:02.303201 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rppmx" Nov 24 09:19:02 crc kubenswrapper[4563]: I1124 09:19:02.303519 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rppmx" Nov 24 09:19:02 crc kubenswrapper[4563]: I1124 09:19:02.364749 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rppmx" Nov 24 09:19:02 crc kubenswrapper[4563]: I1124 09:19:02.448234 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9" Nov 24 09:19:02 crc kubenswrapper[4563]: I1124 09:19:02.510606 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-config\") pod \"f2330283-b2b3-4fe5-9924-bc2f48c08497\" (UID: \"f2330283-b2b3-4fe5-9924-bc2f48c08497\") " Nov 24 09:19:02 crc kubenswrapper[4563]: I1124 09:19:02.510678 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-dns-svc\") pod \"f2330283-b2b3-4fe5-9924-bc2f48c08497\" (UID: \"f2330283-b2b3-4fe5-9924-bc2f48c08497\") " Nov 24 09:19:02 crc kubenswrapper[4563]: I1124 09:19:02.510723 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhm69\" (UniqueName: \"kubernetes.io/projected/f2330283-b2b3-4fe5-9924-bc2f48c08497-kube-api-access-dhm69\") pod \"f2330283-b2b3-4fe5-9924-bc2f48c08497\" (UID: \"f2330283-b2b3-4fe5-9924-bc2f48c08497\") " Nov 24 09:19:02 crc kubenswrapper[4563]: I1124 09:19:02.510846 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-dns-swift-storage-0\") pod \"f2330283-b2b3-4fe5-9924-bc2f48c08497\" (UID: \"f2330283-b2b3-4fe5-9924-bc2f48c08497\") " Nov 24 09:19:02 crc kubenswrapper[4563]: I1124 09:19:02.510895 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-ovsdbserver-nb\") pod \"f2330283-b2b3-4fe5-9924-bc2f48c08497\" (UID: \"f2330283-b2b3-4fe5-9924-bc2f48c08497\") " Nov 24 09:19:02 crc kubenswrapper[4563]: I1124 09:19:02.510925 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-ovsdbserver-sb\") pod \"f2330283-b2b3-4fe5-9924-bc2f48c08497\" (UID: \"f2330283-b2b3-4fe5-9924-bc2f48c08497\") " Nov 24 09:19:02 crc kubenswrapper[4563]: I1124 09:19:02.515998 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2330283-b2b3-4fe5-9924-bc2f48c08497-kube-api-access-dhm69" (OuterVolumeSpecName: "kube-api-access-dhm69") pod "f2330283-b2b3-4fe5-9924-bc2f48c08497" (UID: "f2330283-b2b3-4fe5-9924-bc2f48c08497"). InnerVolumeSpecName "kube-api-access-dhm69". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:19:02 crc kubenswrapper[4563]: I1124 09:19:02.546211 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f2330283-b2b3-4fe5-9924-bc2f48c08497" (UID: "f2330283-b2b3-4fe5-9924-bc2f48c08497"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:19:02 crc kubenswrapper[4563]: I1124 09:19:02.547128 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f2330283-b2b3-4fe5-9924-bc2f48c08497" (UID: "f2330283-b2b3-4fe5-9924-bc2f48c08497"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:19:02 crc kubenswrapper[4563]: I1124 09:19:02.548048 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f2330283-b2b3-4fe5-9924-bc2f48c08497" (UID: "f2330283-b2b3-4fe5-9924-bc2f48c08497"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:19:02 crc kubenswrapper[4563]: I1124 09:19:02.550952 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-config" (OuterVolumeSpecName: "config") pod "f2330283-b2b3-4fe5-9924-bc2f48c08497" (UID: "f2330283-b2b3-4fe5-9924-bc2f48c08497"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:19:02 crc kubenswrapper[4563]: I1124 09:19:02.551560 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f2330283-b2b3-4fe5-9924-bc2f48c08497" (UID: "f2330283-b2b3-4fe5-9924-bc2f48c08497"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:19:02 crc kubenswrapper[4563]: I1124 09:19:02.614841 4563 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:02 crc kubenswrapper[4563]: I1124 09:19:02.614873 4563 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:02 crc kubenswrapper[4563]: I1124 09:19:02.614883 4563 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:02 crc kubenswrapper[4563]: I1124 09:19:02.614894 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:02 
crc kubenswrapper[4563]: I1124 09:19:02.614906 4563 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2330283-b2b3-4fe5-9924-bc2f48c08497-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:02 crc kubenswrapper[4563]: I1124 09:19:02.614914 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhm69\" (UniqueName: \"kubernetes.io/projected/f2330283-b2b3-4fe5-9924-bc2f48c08497-kube-api-access-dhm69\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:02 crc kubenswrapper[4563]: I1124 09:19:02.648677 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6565cb8596-rhwtd"] Nov 24 09:19:03 crc kubenswrapper[4563]: E1124 09:19:03.021036 4563 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode26141fd_4cfa_4726_ba65_1f3bb830411b.slice/crio-eb2d2a98b7536381da20481d24c6608674b2f2f96bcb66bedf0cae985b0ca428.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode26141fd_4cfa_4726_ba65_1f3bb830411b.slice/crio-conmon-eb2d2a98b7536381da20481d24c6608674b2f2f96bcb66bedf0cae985b0ca428.scope\": RecentStats: unable to find data in memory cache]" Nov 24 09:19:03 crc kubenswrapper[4563]: I1124 09:19:03.196691 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-frjwb" event={"ID":"ed99b33d-c985-44bf-9a4a-b9f93bf3927f","Type":"ContainerStarted","Data":"a75fb20e0f9fb563e540d2e760669d37d93203806b15615e19c69e43dc7709ed"} Nov 24 09:19:03 crc kubenswrapper[4563]: I1124 09:19:03.200110 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b995b5b3-41b0-4334-9f7c-792a50e780e7","Type":"ContainerStarted","Data":"3991af0b0f902f2e913b758214f68941b05dd8ebc1d3590242e7b910416be4c7"} Nov 24 09:19:03 crc kubenswrapper[4563]: I1124 
09:19:03.205784 4563 generic.go:334] "Generic (PLEG): container finished" podID="e26141fd-4cfa-4726-ba65-1f3bb830411b" containerID="eb2d2a98b7536381da20481d24c6608674b2f2f96bcb66bedf0cae985b0ca428" exitCode=0 Nov 24 09:19:03 crc kubenswrapper[4563]: I1124 09:19:03.205890 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r9lpl" event={"ID":"e26141fd-4cfa-4726-ba65-1f3bb830411b","Type":"ContainerDied","Data":"eb2d2a98b7536381da20481d24c6608674b2f2f96bcb66bedf0cae985b0ca428"} Nov 24 09:19:03 crc kubenswrapper[4563]: I1124 09:19:03.209534 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9" event={"ID":"f2330283-b2b3-4fe5-9924-bc2f48c08497","Type":"ContainerDied","Data":"93264d69bdf6a0760e58017a2f90b805fc41f30c35f45fb27de38d97c191cc88"} Nov 24 09:19:03 crc kubenswrapper[4563]: I1124 09:19:03.209592 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76c58b6d97-qn6g9" Nov 24 09:19:03 crc kubenswrapper[4563]: I1124 09:19:03.209600 4563 scope.go:117] "RemoveContainer" containerID="3dcb35494414940ca18de8743344ae135132b027cc72bd50e99f35ca4180e38b" Nov 24 09:19:03 crc kubenswrapper[4563]: I1124 09:19:03.213979 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6565cb8596-rhwtd" event={"ID":"d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1","Type":"ContainerStarted","Data":"1ac6a8dd1155fcf9b8cbf811b2701dafb42e8a7094a6d950fc11580bab2b6424"} Nov 24 09:19:03 crc kubenswrapper[4563]: I1124 09:19:03.214051 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6565cb8596-rhwtd" event={"ID":"d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1","Type":"ContainerStarted","Data":"1bd5f112361da4607f894785bd2ae627a08a96690c492d6bb360f5b63d90eaeb"} Nov 24 09:19:03 crc kubenswrapper[4563]: I1124 09:19:03.218769 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-db-sync-frjwb" podStartSLOduration=2.503515321 podStartE2EDuration="39.218749001s" podCreationTimestamp="2025-11-24 09:18:24 +0000 UTC" firstStartedPulling="2025-11-24 09:18:25.874165636 +0000 UTC m=+883.133143082" lastFinishedPulling="2025-11-24 09:19:02.589399315 +0000 UTC m=+919.848376762" observedRunningTime="2025-11-24 09:19:03.211816386 +0000 UTC m=+920.470793834" watchObservedRunningTime="2025-11-24 09:19:03.218749001 +0000 UTC m=+920.477726448" Nov 24 09:19:03 crc kubenswrapper[4563]: I1124 09:19:03.260275 4563 scope.go:117] "RemoveContainer" containerID="cd9a2e6efb8b95d3e0180b68415a21edfdfb8771f9b1c7b43b95815c58884285" Nov 24 09:19:03 crc kubenswrapper[4563]: I1124 09:19:03.285198 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6565cb8596-rhwtd" podStartSLOduration=5.285184529 podStartE2EDuration="5.285184529s" podCreationTimestamp="2025-11-24 09:18:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:19:03.260007555 +0000 UTC m=+920.518985002" watchObservedRunningTime="2025-11-24 09:19:03.285184529 +0000 UTC m=+920.544161976" Nov 24 09:19:03 crc kubenswrapper[4563]: I1124 09:19:03.290025 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rppmx" Nov 24 09:19:03 crc kubenswrapper[4563]: I1124 09:19:03.290705 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76c58b6d97-qn6g9"] Nov 24 09:19:03 crc kubenswrapper[4563]: I1124 09:19:03.299826 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76c58b6d97-qn6g9"] Nov 24 09:19:03 crc kubenswrapper[4563]: I1124 09:19:03.396531 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 24 09:19:03 crc kubenswrapper[4563]: I1124 09:19:03.397622 4563 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 24 09:19:03 crc kubenswrapper[4563]: I1124 09:19:03.427253 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 24 09:19:03 crc kubenswrapper[4563]: I1124 09:19:03.433228 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 24 09:19:04 crc kubenswrapper[4563]: I1124 09:19:04.162230 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rppmx"] Nov 24 09:19:04 crc kubenswrapper[4563]: I1124 09:19:04.224807 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6565cb8596-rhwtd" Nov 24 09:19:04 crc kubenswrapper[4563]: I1124 09:19:04.225472 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 24 09:19:04 crc kubenswrapper[4563]: I1124 09:19:04.225490 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 24 09:19:04 crc kubenswrapper[4563]: I1124 09:19:04.650299 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7cd5c59c66-hrmf5" podUID="0ec5b651-57ef-414b-8c8e-4b488d71663f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Nov 24 09:19:04 crc kubenswrapper[4563]: I1124 09:19:04.851629 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6d999bbd6-cqj6s" podUID="a7688cb4-70ea-43e4-85f2-6b96f972538f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Nov 24 09:19:05 crc 
kubenswrapper[4563]: I1124 09:19:05.065473 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2330283-b2b3-4fe5-9924-bc2f48c08497" path="/var/lib/kubelet/pods/f2330283-b2b3-4fe5-9924-bc2f48c08497/volumes" Nov 24 09:19:05 crc kubenswrapper[4563]: I1124 09:19:05.235621 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r9lpl" event={"ID":"e26141fd-4cfa-4726-ba65-1f3bb830411b","Type":"ContainerStarted","Data":"62607a8cd63158b9f85c8b488d94ea4f517a243c748ec1e1b25689867fbf9697"} Nov 24 09:19:05 crc kubenswrapper[4563]: I1124 09:19:05.245813 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8klw7" event={"ID":"d803d6ca-646a-4dd5-93ef-d096b501c28a","Type":"ContainerStarted","Data":"1e68b8254c01f5945197761aa93a0a3825f41da8f8a75fadc6d0a90987706bf4"} Nov 24 09:19:05 crc kubenswrapper[4563]: I1124 09:19:05.246753 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rppmx" podUID="571d80f9-80d0-4dae-bfa0-126c8055f9b0" containerName="registry-server" containerID="cri-o://b8394dbe6dc07d3f781e10185db336446fa29c2605c94506c32c2fb21e3d0c10" gracePeriod=2 Nov 24 09:19:05 crc kubenswrapper[4563]: I1124 09:19:05.259522 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r9lpl" podStartSLOduration=16.840498729 podStartE2EDuration="30.259508403s" podCreationTimestamp="2025-11-24 09:18:35 +0000 UTC" firstStartedPulling="2025-11-24 09:18:51.022949117 +0000 UTC m=+908.281926564" lastFinishedPulling="2025-11-24 09:19:04.441958791 +0000 UTC m=+921.700936238" observedRunningTime="2025-11-24 09:19:05.256252722 +0000 UTC m=+922.515230169" watchObservedRunningTime="2025-11-24 09:19:05.259508403 +0000 UTC m=+922.518485850" Nov 24 09:19:05 crc kubenswrapper[4563]: I1124 09:19:05.279683 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/placement-db-sync-8klw7" podStartSLOduration=3.273892463 podStartE2EDuration="41.279632025s" podCreationTimestamp="2025-11-24 09:18:24 +0000 UTC" firstStartedPulling="2025-11-24 09:18:26.176810533 +0000 UTC m=+883.435787980" lastFinishedPulling="2025-11-24 09:19:04.182550094 +0000 UTC m=+921.441527542" observedRunningTime="2025-11-24 09:19:05.277888757 +0000 UTC m=+922.536866205" watchObservedRunningTime="2025-11-24 09:19:05.279632025 +0000 UTC m=+922.538609472" Nov 24 09:19:05 crc kubenswrapper[4563]: I1124 09:19:05.694242 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rppmx" Nov 24 09:19:05 crc kubenswrapper[4563]: I1124 09:19:05.789525 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/571d80f9-80d0-4dae-bfa0-126c8055f9b0-catalog-content\") pod \"571d80f9-80d0-4dae-bfa0-126c8055f9b0\" (UID: \"571d80f9-80d0-4dae-bfa0-126c8055f9b0\") " Nov 24 09:19:05 crc kubenswrapper[4563]: I1124 09:19:05.789612 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/571d80f9-80d0-4dae-bfa0-126c8055f9b0-utilities\") pod \"571d80f9-80d0-4dae-bfa0-126c8055f9b0\" (UID: \"571d80f9-80d0-4dae-bfa0-126c8055f9b0\") " Nov 24 09:19:05 crc kubenswrapper[4563]: I1124 09:19:05.789653 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gsh7\" (UniqueName: \"kubernetes.io/projected/571d80f9-80d0-4dae-bfa0-126c8055f9b0-kube-api-access-8gsh7\") pod \"571d80f9-80d0-4dae-bfa0-126c8055f9b0\" (UID: \"571d80f9-80d0-4dae-bfa0-126c8055f9b0\") " Nov 24 09:19:05 crc kubenswrapper[4563]: I1124 09:19:05.790705 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/571d80f9-80d0-4dae-bfa0-126c8055f9b0-utilities" (OuterVolumeSpecName: "utilities") pod 
"571d80f9-80d0-4dae-bfa0-126c8055f9b0" (UID: "571d80f9-80d0-4dae-bfa0-126c8055f9b0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:19:05 crc kubenswrapper[4563]: I1124 09:19:05.791039 4563 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/571d80f9-80d0-4dae-bfa0-126c8055f9b0-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:05 crc kubenswrapper[4563]: I1124 09:19:05.797017 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/571d80f9-80d0-4dae-bfa0-126c8055f9b0-kube-api-access-8gsh7" (OuterVolumeSpecName: "kube-api-access-8gsh7") pod "571d80f9-80d0-4dae-bfa0-126c8055f9b0" (UID: "571d80f9-80d0-4dae-bfa0-126c8055f9b0"). InnerVolumeSpecName "kube-api-access-8gsh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:19:05 crc kubenswrapper[4563]: I1124 09:19:05.806988 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/571d80f9-80d0-4dae-bfa0-126c8055f9b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "571d80f9-80d0-4dae-bfa0-126c8055f9b0" (UID: "571d80f9-80d0-4dae-bfa0-126c8055f9b0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:19:05 crc kubenswrapper[4563]: I1124 09:19:05.893569 4563 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/571d80f9-80d0-4dae-bfa0-126c8055f9b0-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:05 crc kubenswrapper[4563]: I1124 09:19:05.893599 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gsh7\" (UniqueName: \"kubernetes.io/projected/571d80f9-80d0-4dae-bfa0-126c8055f9b0-kube-api-access-8gsh7\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:06 crc kubenswrapper[4563]: I1124 09:19:06.063343 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 24 09:19:06 crc kubenswrapper[4563]: I1124 09:19:06.133863 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r9lpl" Nov 24 09:19:06 crc kubenswrapper[4563]: I1124 09:19:06.133911 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r9lpl" Nov 24 09:19:06 crc kubenswrapper[4563]: I1124 09:19:06.259224 4563 generic.go:334] "Generic (PLEG): container finished" podID="571d80f9-80d0-4dae-bfa0-126c8055f9b0" containerID="b8394dbe6dc07d3f781e10185db336446fa29c2605c94506c32c2fb21e3d0c10" exitCode=0 Nov 24 09:19:06 crc kubenswrapper[4563]: I1124 09:19:06.259301 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rppmx" event={"ID":"571d80f9-80d0-4dae-bfa0-126c8055f9b0","Type":"ContainerDied","Data":"b8394dbe6dc07d3f781e10185db336446fa29c2605c94506c32c2fb21e3d0c10"} Nov 24 09:19:06 crc kubenswrapper[4563]: I1124 09:19:06.259837 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rppmx" 
event={"ID":"571d80f9-80d0-4dae-bfa0-126c8055f9b0","Type":"ContainerDied","Data":"db91f4523ae27bdd03dd5e4bd236394ff19a0e5fb5172e04beaf6a8d4bafa43d"} Nov 24 09:19:06 crc kubenswrapper[4563]: I1124 09:19:06.259313 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rppmx" Nov 24 09:19:06 crc kubenswrapper[4563]: I1124 09:19:06.259899 4563 scope.go:117] "RemoveContainer" containerID="b8394dbe6dc07d3f781e10185db336446fa29c2605c94506c32c2fb21e3d0c10" Nov 24 09:19:06 crc kubenswrapper[4563]: I1124 09:19:06.262164 4563 generic.go:334] "Generic (PLEG): container finished" podID="ed99b33d-c985-44bf-9a4a-b9f93bf3927f" containerID="a75fb20e0f9fb563e540d2e760669d37d93203806b15615e19c69e43dc7709ed" exitCode=0 Nov 24 09:19:06 crc kubenswrapper[4563]: I1124 09:19:06.262233 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-frjwb" event={"ID":"ed99b33d-c985-44bf-9a4a-b9f93bf3927f","Type":"ContainerDied","Data":"a75fb20e0f9fb563e540d2e760669d37d93203806b15615e19c69e43dc7709ed"} Nov 24 09:19:06 crc kubenswrapper[4563]: I1124 09:19:06.264350 4563 generic.go:334] "Generic (PLEG): container finished" podID="d803d6ca-646a-4dd5-93ef-d096b501c28a" containerID="1e68b8254c01f5945197761aa93a0a3825f41da8f8a75fadc6d0a90987706bf4" exitCode=0 Nov 24 09:19:06 crc kubenswrapper[4563]: I1124 09:19:06.264425 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8klw7" event={"ID":"d803d6ca-646a-4dd5-93ef-d096b501c28a","Type":"ContainerDied","Data":"1e68b8254c01f5945197761aa93a0a3825f41da8f8a75fadc6d0a90987706bf4"} Nov 24 09:19:06 crc kubenswrapper[4563]: I1124 09:19:06.265932 4563 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 24 09:19:06 crc kubenswrapper[4563]: I1124 09:19:06.265940 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-stk25" 
event={"ID":"b0793e21-229f-415e-8b3e-1499e1ed3bf6","Type":"ContainerStarted","Data":"484888d9344f9856ecfdccaba8884e59ceabedfdbe8d25da5cb2812646da6652"} Nov 24 09:19:06 crc kubenswrapper[4563]: I1124 09:19:06.283025 4563 scope.go:117] "RemoveContainer" containerID="7ef7f9add08a8ee7aa538215f14ed78667ffa8fcd949150630626fd6be2c4a1f" Nov 24 09:19:06 crc kubenswrapper[4563]: I1124 09:19:06.318251 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rppmx"] Nov 24 09:19:06 crc kubenswrapper[4563]: I1124 09:19:06.328866 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-stk25" podStartSLOduration=3.359407206 podStartE2EDuration="42.328843323s" podCreationTimestamp="2025-11-24 09:18:24 +0000 UTC" firstStartedPulling="2025-11-24 09:18:25.672738211 +0000 UTC m=+882.931715657" lastFinishedPulling="2025-11-24 09:19:04.642174327 +0000 UTC m=+921.901151774" observedRunningTime="2025-11-24 09:19:06.315962159 +0000 UTC m=+923.574939606" watchObservedRunningTime="2025-11-24 09:19:06.328843323 +0000 UTC m=+923.587820769" Nov 24 09:19:06 crc kubenswrapper[4563]: I1124 09:19:06.329128 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rppmx"] Nov 24 09:19:06 crc kubenswrapper[4563]: I1124 09:19:06.374344 4563 scope.go:117] "RemoveContainer" containerID="6c4a123f90c7457838fb2c4f3802362d72bc951ffa13b466cdf2216ed6484ad1" Nov 24 09:19:06 crc kubenswrapper[4563]: I1124 09:19:06.408869 4563 scope.go:117] "RemoveContainer" containerID="b8394dbe6dc07d3f781e10185db336446fa29c2605c94506c32c2fb21e3d0c10" Nov 24 09:19:06 crc kubenswrapper[4563]: E1124 09:19:06.410790 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8394dbe6dc07d3f781e10185db336446fa29c2605c94506c32c2fb21e3d0c10\": container with ID starting with 
b8394dbe6dc07d3f781e10185db336446fa29c2605c94506c32c2fb21e3d0c10 not found: ID does not exist" containerID="b8394dbe6dc07d3f781e10185db336446fa29c2605c94506c32c2fb21e3d0c10" Nov 24 09:19:06 crc kubenswrapper[4563]: I1124 09:19:06.410844 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8394dbe6dc07d3f781e10185db336446fa29c2605c94506c32c2fb21e3d0c10"} err="failed to get container status \"b8394dbe6dc07d3f781e10185db336446fa29c2605c94506c32c2fb21e3d0c10\": rpc error: code = NotFound desc = could not find container \"b8394dbe6dc07d3f781e10185db336446fa29c2605c94506c32c2fb21e3d0c10\": container with ID starting with b8394dbe6dc07d3f781e10185db336446fa29c2605c94506c32c2fb21e3d0c10 not found: ID does not exist" Nov 24 09:19:06 crc kubenswrapper[4563]: I1124 09:19:06.410880 4563 scope.go:117] "RemoveContainer" containerID="7ef7f9add08a8ee7aa538215f14ed78667ffa8fcd949150630626fd6be2c4a1f" Nov 24 09:19:06 crc kubenswrapper[4563]: E1124 09:19:06.414715 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ef7f9add08a8ee7aa538215f14ed78667ffa8fcd949150630626fd6be2c4a1f\": container with ID starting with 7ef7f9add08a8ee7aa538215f14ed78667ffa8fcd949150630626fd6be2c4a1f not found: ID does not exist" containerID="7ef7f9add08a8ee7aa538215f14ed78667ffa8fcd949150630626fd6be2c4a1f" Nov 24 09:19:06 crc kubenswrapper[4563]: I1124 09:19:06.414751 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ef7f9add08a8ee7aa538215f14ed78667ffa8fcd949150630626fd6be2c4a1f"} err="failed to get container status \"7ef7f9add08a8ee7aa538215f14ed78667ffa8fcd949150630626fd6be2c4a1f\": rpc error: code = NotFound desc = could not find container \"7ef7f9add08a8ee7aa538215f14ed78667ffa8fcd949150630626fd6be2c4a1f\": container with ID starting with 7ef7f9add08a8ee7aa538215f14ed78667ffa8fcd949150630626fd6be2c4a1f not found: ID does not 
exist" Nov 24 09:19:06 crc kubenswrapper[4563]: I1124 09:19:06.414774 4563 scope.go:117] "RemoveContainer" containerID="6c4a123f90c7457838fb2c4f3802362d72bc951ffa13b466cdf2216ed6484ad1" Nov 24 09:19:06 crc kubenswrapper[4563]: E1124 09:19:06.419128 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c4a123f90c7457838fb2c4f3802362d72bc951ffa13b466cdf2216ed6484ad1\": container with ID starting with 6c4a123f90c7457838fb2c4f3802362d72bc951ffa13b466cdf2216ed6484ad1 not found: ID does not exist" containerID="6c4a123f90c7457838fb2c4f3802362d72bc951ffa13b466cdf2216ed6484ad1" Nov 24 09:19:06 crc kubenswrapper[4563]: I1124 09:19:06.419168 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c4a123f90c7457838fb2c4f3802362d72bc951ffa13b466cdf2216ed6484ad1"} err="failed to get container status \"6c4a123f90c7457838fb2c4f3802362d72bc951ffa13b466cdf2216ed6484ad1\": rpc error: code = NotFound desc = could not find container \"6c4a123f90c7457838fb2c4f3802362d72bc951ffa13b466cdf2216ed6484ad1\": container with ID starting with 6c4a123f90c7457838fb2c4f3802362d72bc951ffa13b466cdf2216ed6484ad1 not found: ID does not exist" Nov 24 09:19:06 crc kubenswrapper[4563]: I1124 09:19:06.420050 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 24 09:19:07 crc kubenswrapper[4563]: I1124 09:19:07.065265 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="571d80f9-80d0-4dae-bfa0-126c8055f9b0" path="/var/lib/kubelet/pods/571d80f9-80d0-4dae-bfa0-126c8055f9b0/volumes" Nov 24 09:19:07 crc kubenswrapper[4563]: I1124 09:19:07.191048 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-r9lpl" podUID="e26141fd-4cfa-4726-ba65-1f3bb830411b" containerName="registry-server" probeResult="failure" output=< Nov 24 09:19:07 crc kubenswrapper[4563]: 
timeout: failed to connect service ":50051" within 1s Nov 24 09:19:07 crc kubenswrapper[4563]: > Nov 24 09:19:09 crc kubenswrapper[4563]: I1124 09:19:09.301776 4563 generic.go:334] "Generic (PLEG): container finished" podID="b0793e21-229f-415e-8b3e-1499e1ed3bf6" containerID="484888d9344f9856ecfdccaba8884e59ceabedfdbe8d25da5cb2812646da6652" exitCode=0 Nov 24 09:19:09 crc kubenswrapper[4563]: I1124 09:19:09.301828 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-stk25" event={"ID":"b0793e21-229f-415e-8b3e-1499e1ed3bf6","Type":"ContainerDied","Data":"484888d9344f9856ecfdccaba8884e59ceabedfdbe8d25da5cb2812646da6652"} Nov 24 09:19:10 crc kubenswrapper[4563]: I1124 09:19:10.257390 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-frjwb" Nov 24 09:19:10 crc kubenswrapper[4563]: I1124 09:19:10.312875 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-frjwb" event={"ID":"ed99b33d-c985-44bf-9a4a-b9f93bf3927f","Type":"ContainerDied","Data":"7b97aaa473874328727049b0f2899fb42ed3527173e5d4c2f6250f1bfa1266b6"} Nov 24 09:19:10 crc kubenswrapper[4563]: I1124 09:19:10.312915 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b97aaa473874328727049b0f2899fb42ed3527173e5d4c2f6250f1bfa1266b6" Nov 24 09:19:10 crc kubenswrapper[4563]: I1124 09:19:10.312926 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-frjwb" Nov 24 09:19:10 crc kubenswrapper[4563]: I1124 09:19:10.399940 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjgcb\" (UniqueName: \"kubernetes.io/projected/ed99b33d-c985-44bf-9a4a-b9f93bf3927f-kube-api-access-tjgcb\") pod \"ed99b33d-c985-44bf-9a4a-b9f93bf3927f\" (UID: \"ed99b33d-c985-44bf-9a4a-b9f93bf3927f\") " Nov 24 09:19:10 crc kubenswrapper[4563]: I1124 09:19:10.400107 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ed99b33d-c985-44bf-9a4a-b9f93bf3927f-db-sync-config-data\") pod \"ed99b33d-c985-44bf-9a4a-b9f93bf3927f\" (UID: \"ed99b33d-c985-44bf-9a4a-b9f93bf3927f\") " Nov 24 09:19:10 crc kubenswrapper[4563]: I1124 09:19:10.400212 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed99b33d-c985-44bf-9a4a-b9f93bf3927f-combined-ca-bundle\") pod \"ed99b33d-c985-44bf-9a4a-b9f93bf3927f\" (UID: \"ed99b33d-c985-44bf-9a4a-b9f93bf3927f\") " Nov 24 09:19:10 crc kubenswrapper[4563]: I1124 09:19:10.406035 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed99b33d-c985-44bf-9a4a-b9f93bf3927f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ed99b33d-c985-44bf-9a4a-b9f93bf3927f" (UID: "ed99b33d-c985-44bf-9a4a-b9f93bf3927f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:10 crc kubenswrapper[4563]: I1124 09:19:10.407200 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed99b33d-c985-44bf-9a4a-b9f93bf3927f-kube-api-access-tjgcb" (OuterVolumeSpecName: "kube-api-access-tjgcb") pod "ed99b33d-c985-44bf-9a4a-b9f93bf3927f" (UID: "ed99b33d-c985-44bf-9a4a-b9f93bf3927f"). 
InnerVolumeSpecName "kube-api-access-tjgcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:19:10 crc kubenswrapper[4563]: I1124 09:19:10.433209 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed99b33d-c985-44bf-9a4a-b9f93bf3927f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed99b33d-c985-44bf-9a4a-b9f93bf3927f" (UID: "ed99b33d-c985-44bf-9a4a-b9f93bf3927f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:10 crc kubenswrapper[4563]: I1124 09:19:10.502940 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed99b33d-c985-44bf-9a4a-b9f93bf3927f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:10 crc kubenswrapper[4563]: I1124 09:19:10.502974 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjgcb\" (UniqueName: \"kubernetes.io/projected/ed99b33d-c985-44bf-9a4a-b9f93bf3927f-kube-api-access-tjgcb\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:10 crc kubenswrapper[4563]: I1124 09:19:10.502989 4563 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ed99b33d-c985-44bf-9a4a-b9f93bf3927f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:10 crc kubenswrapper[4563]: I1124 09:19:10.863001 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8klw7" Nov 24 09:19:10 crc kubenswrapper[4563]: I1124 09:19:10.866100 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-stk25" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.021256 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b0793e21-229f-415e-8b3e-1499e1ed3bf6-db-sync-config-data\") pod \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\" (UID: \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\") " Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.021362 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d803d6ca-646a-4dd5-93ef-d096b501c28a-config-data\") pod \"d803d6ca-646a-4dd5-93ef-d096b501c28a\" (UID: \"d803d6ca-646a-4dd5-93ef-d096b501c28a\") " Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.021460 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0793e21-229f-415e-8b3e-1499e1ed3bf6-scripts\") pod \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\" (UID: \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\") " Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.021518 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0793e21-229f-415e-8b3e-1499e1ed3bf6-config-data\") pod \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\" (UID: \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\") " Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.021585 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0793e21-229f-415e-8b3e-1499e1ed3bf6-combined-ca-bundle\") pod \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\" (UID: \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\") " Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.021659 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnr86\" (UniqueName: 
\"kubernetes.io/projected/b0793e21-229f-415e-8b3e-1499e1ed3bf6-kube-api-access-wnr86\") pod \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\" (UID: \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\") " Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.021693 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npgkh\" (UniqueName: \"kubernetes.io/projected/d803d6ca-646a-4dd5-93ef-d096b501c28a-kube-api-access-npgkh\") pod \"d803d6ca-646a-4dd5-93ef-d096b501c28a\" (UID: \"d803d6ca-646a-4dd5-93ef-d096b501c28a\") " Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.021758 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d803d6ca-646a-4dd5-93ef-d096b501c28a-logs\") pod \"d803d6ca-646a-4dd5-93ef-d096b501c28a\" (UID: \"d803d6ca-646a-4dd5-93ef-d096b501c28a\") " Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.021804 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d803d6ca-646a-4dd5-93ef-d096b501c28a-scripts\") pod \"d803d6ca-646a-4dd5-93ef-d096b501c28a\" (UID: \"d803d6ca-646a-4dd5-93ef-d096b501c28a\") " Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.021964 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b0793e21-229f-415e-8b3e-1499e1ed3bf6-etc-machine-id\") pod \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\" (UID: \"b0793e21-229f-415e-8b3e-1499e1ed3bf6\") " Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.022057 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d803d6ca-646a-4dd5-93ef-d096b501c28a-combined-ca-bundle\") pod \"d803d6ca-646a-4dd5-93ef-d096b501c28a\" (UID: \"d803d6ca-646a-4dd5-93ef-d096b501c28a\") " Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.022177 
4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0793e21-229f-415e-8b3e-1499e1ed3bf6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b0793e21-229f-415e-8b3e-1499e1ed3bf6" (UID: "b0793e21-229f-415e-8b3e-1499e1ed3bf6"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.022493 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d803d6ca-646a-4dd5-93ef-d096b501c28a-logs" (OuterVolumeSpecName: "logs") pod "d803d6ca-646a-4dd5-93ef-d096b501c28a" (UID: "d803d6ca-646a-4dd5-93ef-d096b501c28a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.023167 4563 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d803d6ca-646a-4dd5-93ef-d096b501c28a-logs\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.023208 4563 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b0793e21-229f-415e-8b3e-1499e1ed3bf6-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.028315 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d803d6ca-646a-4dd5-93ef-d096b501c28a-scripts" (OuterVolumeSpecName: "scripts") pod "d803d6ca-646a-4dd5-93ef-d096b501c28a" (UID: "d803d6ca-646a-4dd5-93ef-d096b501c28a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.029776 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d803d6ca-646a-4dd5-93ef-d096b501c28a-kube-api-access-npgkh" (OuterVolumeSpecName: "kube-api-access-npgkh") pod "d803d6ca-646a-4dd5-93ef-d096b501c28a" (UID: "d803d6ca-646a-4dd5-93ef-d096b501c28a"). InnerVolumeSpecName "kube-api-access-npgkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.030014 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0793e21-229f-415e-8b3e-1499e1ed3bf6-scripts" (OuterVolumeSpecName: "scripts") pod "b0793e21-229f-415e-8b3e-1499e1ed3bf6" (UID: "b0793e21-229f-415e-8b3e-1499e1ed3bf6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.030345 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0793e21-229f-415e-8b3e-1499e1ed3bf6-kube-api-access-wnr86" (OuterVolumeSpecName: "kube-api-access-wnr86") pod "b0793e21-229f-415e-8b3e-1499e1ed3bf6" (UID: "b0793e21-229f-415e-8b3e-1499e1ed3bf6"). InnerVolumeSpecName "kube-api-access-wnr86". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.033300 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0793e21-229f-415e-8b3e-1499e1ed3bf6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b0793e21-229f-415e-8b3e-1499e1ed3bf6" (UID: "b0793e21-229f-415e-8b3e-1499e1ed3bf6"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.048850 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d803d6ca-646a-4dd5-93ef-d096b501c28a-config-data" (OuterVolumeSpecName: "config-data") pod "d803d6ca-646a-4dd5-93ef-d096b501c28a" (UID: "d803d6ca-646a-4dd5-93ef-d096b501c28a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.050158 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0793e21-229f-415e-8b3e-1499e1ed3bf6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0793e21-229f-415e-8b3e-1499e1ed3bf6" (UID: "b0793e21-229f-415e-8b3e-1499e1ed3bf6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.056020 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d803d6ca-646a-4dd5-93ef-d096b501c28a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d803d6ca-646a-4dd5-93ef-d096b501c28a" (UID: "d803d6ca-646a-4dd5-93ef-d096b501c28a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.074212 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0793e21-229f-415e-8b3e-1499e1ed3bf6-config-data" (OuterVolumeSpecName: "config-data") pod "b0793e21-229f-415e-8b3e-1499e1ed3bf6" (UID: "b0793e21-229f-415e-8b3e-1499e1ed3bf6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.124700 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d803d6ca-646a-4dd5-93ef-d096b501c28a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.124749 4563 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b0793e21-229f-415e-8b3e-1499e1ed3bf6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.124760 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d803d6ca-646a-4dd5-93ef-d096b501c28a-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.124769 4563 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0793e21-229f-415e-8b3e-1499e1ed3bf6-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.124777 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0793e21-229f-415e-8b3e-1499e1ed3bf6-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.124786 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0793e21-229f-415e-8b3e-1499e1ed3bf6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.124796 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnr86\" (UniqueName: \"kubernetes.io/projected/b0793e21-229f-415e-8b3e-1499e1ed3bf6-kube-api-access-wnr86\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.124806 4563 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npgkh\" (UniqueName: \"kubernetes.io/projected/d803d6ca-646a-4dd5-93ef-d096b501c28a-kube-api-access-npgkh\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.124816 4563 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d803d6ca-646a-4dd5-93ef-d096b501c28a-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.323860 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-stk25" event={"ID":"b0793e21-229f-415e-8b3e-1499e1ed3bf6","Type":"ContainerDied","Data":"a959c89158ac6d8363a77fa25e41b421a80c8d98331aa61e9bf5e15b6c0e6f5c"} Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.324058 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a959c89158ac6d8363a77fa25e41b421a80c8d98331aa61e9bf5e15b6c0e6f5c" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.323900 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-stk25" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.329822 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8klw7" event={"ID":"d803d6ca-646a-4dd5-93ef-d096b501c28a","Type":"ContainerDied","Data":"91de1d918813f67bcad3b84860039b1937ba4b42fd6e46c790398237d4bf71a2"} Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.329854 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91de1d918813f67bcad3b84860039b1937ba4b42fd6e46c790398237d4bf71a2" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.329909 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-8klw7" Nov 24 09:19:11 crc kubenswrapper[4563]: E1124 09:19:11.427561 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="b995b5b3-41b0-4334-9f7c-792a50e780e7" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.525525 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-64cbb46c46-7fsmj"] Nov 24 09:19:11 crc kubenswrapper[4563]: E1124 09:19:11.528896 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2330283-b2b3-4fe5-9924-bc2f48c08497" containerName="dnsmasq-dns" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.528919 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2330283-b2b3-4fe5-9924-bc2f48c08497" containerName="dnsmasq-dns" Nov 24 09:19:11 crc kubenswrapper[4563]: E1124 09:19:11.528938 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d803d6ca-646a-4dd5-93ef-d096b501c28a" containerName="placement-db-sync" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.528945 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="d803d6ca-646a-4dd5-93ef-d096b501c28a" containerName="placement-db-sync" Nov 24 09:19:11 crc kubenswrapper[4563]: E1124 09:19:11.528958 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2330283-b2b3-4fe5-9924-bc2f48c08497" containerName="init" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.528963 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2330283-b2b3-4fe5-9924-bc2f48c08497" containerName="init" Nov 24 09:19:11 crc kubenswrapper[4563]: E1124 09:19:11.528984 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="571d80f9-80d0-4dae-bfa0-126c8055f9b0" containerName="registry-server" Nov 24 09:19:11 crc 
kubenswrapper[4563]: I1124 09:19:11.528991 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="571d80f9-80d0-4dae-bfa0-126c8055f9b0" containerName="registry-server" Nov 24 09:19:11 crc kubenswrapper[4563]: E1124 09:19:11.529003 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="571d80f9-80d0-4dae-bfa0-126c8055f9b0" containerName="extract-content" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.529008 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="571d80f9-80d0-4dae-bfa0-126c8055f9b0" containerName="extract-content" Nov 24 09:19:11 crc kubenswrapper[4563]: E1124 09:19:11.529018 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed99b33d-c985-44bf-9a4a-b9f93bf3927f" containerName="barbican-db-sync" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.529023 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed99b33d-c985-44bf-9a4a-b9f93bf3927f" containerName="barbican-db-sync" Nov 24 09:19:11 crc kubenswrapper[4563]: E1124 09:19:11.529045 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="571d80f9-80d0-4dae-bfa0-126c8055f9b0" containerName="extract-utilities" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.529050 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="571d80f9-80d0-4dae-bfa0-126c8055f9b0" containerName="extract-utilities" Nov 24 09:19:11 crc kubenswrapper[4563]: E1124 09:19:11.529060 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0793e21-229f-415e-8b3e-1499e1ed3bf6" containerName="cinder-db-sync" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.529066 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0793e21-229f-415e-8b3e-1499e1ed3bf6" containerName="cinder-db-sync" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.529236 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2330283-b2b3-4fe5-9924-bc2f48c08497" containerName="dnsmasq-dns" Nov 24 09:19:11 crc 
kubenswrapper[4563]: I1124 09:19:11.529251 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0793e21-229f-415e-8b3e-1499e1ed3bf6" containerName="cinder-db-sync" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.529266 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="d803d6ca-646a-4dd5-93ef-d096b501c28a" containerName="placement-db-sync" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.529279 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="571d80f9-80d0-4dae-bfa0-126c8055f9b0" containerName="registry-server" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.529288 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed99b33d-c985-44bf-9a4a-b9f93bf3927f" containerName="barbican-db-sync" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.530193 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-64cbb46c46-7fsmj" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.535545 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.535794 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-hv8j4" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.536004 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.546706 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6d65456589-92q4d"] Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.548163 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6d65456589-92q4d" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.550237 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.555435 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-64cbb46c46-7fsmj"] Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.564532 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.565870 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.569363 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.569626 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.578573 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.578771 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-llspk" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.580010 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6d65456589-92q4d"] Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.585228 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.638072 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17d8ec67-c825-4ab0-bd77-cd610ff6838e-logs\") 
pod \"barbican-keystone-listener-64cbb46c46-7fsmj\" (UID: \"17d8ec67-c825-4ab0-bd77-cd610ff6838e\") " pod="openstack/barbican-keystone-listener-64cbb46c46-7fsmj" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.638224 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17d8ec67-c825-4ab0-bd77-cd610ff6838e-config-data\") pod \"barbican-keystone-listener-64cbb46c46-7fsmj\" (UID: \"17d8ec67-c825-4ab0-bd77-cd610ff6838e\") " pod="openstack/barbican-keystone-listener-64cbb46c46-7fsmj" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.638470 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17d8ec67-c825-4ab0-bd77-cd610ff6838e-config-data-custom\") pod \"barbican-keystone-listener-64cbb46c46-7fsmj\" (UID: \"17d8ec67-c825-4ab0-bd77-cd610ff6838e\") " pod="openstack/barbican-keystone-listener-64cbb46c46-7fsmj" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.638546 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blpbv\" (UniqueName: \"kubernetes.io/projected/17d8ec67-c825-4ab0-bd77-cd610ff6838e-kube-api-access-blpbv\") pod \"barbican-keystone-listener-64cbb46c46-7fsmj\" (UID: \"17d8ec67-c825-4ab0-bd77-cd610ff6838e\") " pod="openstack/barbican-keystone-listener-64cbb46c46-7fsmj" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.638631 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17d8ec67-c825-4ab0-bd77-cd610ff6838e-combined-ca-bundle\") pod \"barbican-keystone-listener-64cbb46c46-7fsmj\" (UID: \"17d8ec67-c825-4ab0-bd77-cd610ff6838e\") " pod="openstack/barbican-keystone-listener-64cbb46c46-7fsmj" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.647887 4563 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-858c959657-bhzfs"] Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.649380 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-858c959657-bhzfs" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.689452 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-858c959657-bhzfs"] Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.740754 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17d8ec67-c825-4ab0-bd77-cd610ff6838e-config-data\") pod \"barbican-keystone-listener-64cbb46c46-7fsmj\" (UID: \"17d8ec67-c825-4ab0-bd77-cd610ff6838e\") " pod="openstack/barbican-keystone-listener-64cbb46c46-7fsmj" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.740841 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c44e09d-b8ec-401c-886e-c4e2c589778d-scripts\") pod \"cinder-scheduler-0\" (UID: \"0c44e09d-b8ec-401c-886e-c4e2c589778d\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.740927 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6hjf\" (UniqueName: \"kubernetes.io/projected/9d88e05b-2750-483f-a0a3-5169e4cc919c-kube-api-access-p6hjf\") pod \"barbican-worker-6d65456589-92q4d\" (UID: \"9d88e05b-2750-483f-a0a3-5169e4cc919c\") " pod="openstack/barbican-worker-6d65456589-92q4d" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.740963 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c44e09d-b8ec-401c-886e-c4e2c589778d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"0c44e09d-b8ec-401c-886e-c4e2c589778d\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.741011 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c44e09d-b8ec-401c-886e-c4e2c589778d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0c44e09d-b8ec-401c-886e-c4e2c589778d\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.741047 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d88e05b-2750-483f-a0a3-5169e4cc919c-logs\") pod \"barbican-worker-6d65456589-92q4d\" (UID: \"9d88e05b-2750-483f-a0a3-5169e4cc919c\") " pod="openstack/barbican-worker-6d65456589-92q4d" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.741077 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d88e05b-2750-483f-a0a3-5169e4cc919c-config-data-custom\") pod \"barbican-worker-6d65456589-92q4d\" (UID: \"9d88e05b-2750-483f-a0a3-5169e4cc919c\") " pod="openstack/barbican-worker-6d65456589-92q4d" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.741140 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0c44e09d-b8ec-401c-886e-c4e2c589778d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0c44e09d-b8ec-401c-886e-c4e2c589778d\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.741158 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d88e05b-2750-483f-a0a3-5169e4cc919c-combined-ca-bundle\") pod \"barbican-worker-6d65456589-92q4d\" 
(UID: \"9d88e05b-2750-483f-a0a3-5169e4cc919c\") " pod="openstack/barbican-worker-6d65456589-92q4d" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.741192 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17d8ec67-c825-4ab0-bd77-cd610ff6838e-config-data-custom\") pod \"barbican-keystone-listener-64cbb46c46-7fsmj\" (UID: \"17d8ec67-c825-4ab0-bd77-cd610ff6838e\") " pod="openstack/barbican-keystone-listener-64cbb46c46-7fsmj" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.741219 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c44e09d-b8ec-401c-886e-c4e2c589778d-config-data\") pod \"cinder-scheduler-0\" (UID: \"0c44e09d-b8ec-401c-886e-c4e2c589778d\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.741251 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blpbv\" (UniqueName: \"kubernetes.io/projected/17d8ec67-c825-4ab0-bd77-cd610ff6838e-kube-api-access-blpbv\") pod \"barbican-keystone-listener-64cbb46c46-7fsmj\" (UID: \"17d8ec67-c825-4ab0-bd77-cd610ff6838e\") " pod="openstack/barbican-keystone-listener-64cbb46c46-7fsmj" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.741319 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d88e05b-2750-483f-a0a3-5169e4cc919c-config-data\") pod \"barbican-worker-6d65456589-92q4d\" (UID: \"9d88e05b-2750-483f-a0a3-5169e4cc919c\") " pod="openstack/barbican-worker-6d65456589-92q4d" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.741359 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/17d8ec67-c825-4ab0-bd77-cd610ff6838e-combined-ca-bundle\") pod \"barbican-keystone-listener-64cbb46c46-7fsmj\" (UID: \"17d8ec67-c825-4ab0-bd77-cd610ff6838e\") " pod="openstack/barbican-keystone-listener-64cbb46c46-7fsmj" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.741388 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qjdw\" (UniqueName: \"kubernetes.io/projected/0c44e09d-b8ec-401c-886e-c4e2c589778d-kube-api-access-5qjdw\") pod \"cinder-scheduler-0\" (UID: \"0c44e09d-b8ec-401c-886e-c4e2c589778d\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.741440 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17d8ec67-c825-4ab0-bd77-cd610ff6838e-logs\") pod \"barbican-keystone-listener-64cbb46c46-7fsmj\" (UID: \"17d8ec67-c825-4ab0-bd77-cd610ff6838e\") " pod="openstack/barbican-keystone-listener-64cbb46c46-7fsmj" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.741988 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17d8ec67-c825-4ab0-bd77-cd610ff6838e-logs\") pod \"barbican-keystone-listener-64cbb46c46-7fsmj\" (UID: \"17d8ec67-c825-4ab0-bd77-cd610ff6838e\") " pod="openstack/barbican-keystone-listener-64cbb46c46-7fsmj" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.749546 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17d8ec67-c825-4ab0-bd77-cd610ff6838e-config-data-custom\") pod \"barbican-keystone-listener-64cbb46c46-7fsmj\" (UID: \"17d8ec67-c825-4ab0-bd77-cd610ff6838e\") " pod="openstack/barbican-keystone-listener-64cbb46c46-7fsmj" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.758342 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/17d8ec67-c825-4ab0-bd77-cd610ff6838e-config-data\") pod \"barbican-keystone-listener-64cbb46c46-7fsmj\" (UID: \"17d8ec67-c825-4ab0-bd77-cd610ff6838e\") " pod="openstack/barbican-keystone-listener-64cbb46c46-7fsmj" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.758850 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17d8ec67-c825-4ab0-bd77-cd610ff6838e-combined-ca-bundle\") pod \"barbican-keystone-listener-64cbb46c46-7fsmj\" (UID: \"17d8ec67-c825-4ab0-bd77-cd610ff6838e\") " pod="openstack/barbican-keystone-listener-64cbb46c46-7fsmj" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.763440 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blpbv\" (UniqueName: \"kubernetes.io/projected/17d8ec67-c825-4ab0-bd77-cd610ff6838e-kube-api-access-blpbv\") pod \"barbican-keystone-listener-64cbb46c46-7fsmj\" (UID: \"17d8ec67-c825-4ab0-bd77-cd610ff6838e\") " pod="openstack/barbican-keystone-listener-64cbb46c46-7fsmj" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.785837 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-858c959657-bhzfs"] Nov 24 09:19:11 crc kubenswrapper[4563]: E1124 09:19:11.786907 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-jswpc ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-858c959657-bhzfs" podUID="59ac890a-e396-499d-90de-dc8203ebb156" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.843738 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c44e09d-b8ec-401c-886e-c4e2c589778d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"0c44e09d-b8ec-401c-886e-c4e2c589778d\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.843780 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d88e05b-2750-483f-a0a3-5169e4cc919c-logs\") pod \"barbican-worker-6d65456589-92q4d\" (UID: \"9d88e05b-2750-483f-a0a3-5169e4cc919c\") " pod="openstack/barbican-worker-6d65456589-92q4d" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.843813 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-config\") pod \"dnsmasq-dns-858c959657-bhzfs\" (UID: \"59ac890a-e396-499d-90de-dc8203ebb156\") " pod="openstack/dnsmasq-dns-858c959657-bhzfs" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.843834 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d88e05b-2750-483f-a0a3-5169e4cc919c-config-data-custom\") pod \"barbican-worker-6d65456589-92q4d\" (UID: \"9d88e05b-2750-483f-a0a3-5169e4cc919c\") " pod="openstack/barbican-worker-6d65456589-92q4d" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.843891 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0c44e09d-b8ec-401c-886e-c4e2c589778d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0c44e09d-b8ec-401c-886e-c4e2c589778d\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.843905 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d88e05b-2750-483f-a0a3-5169e4cc919c-combined-ca-bundle\") pod \"barbican-worker-6d65456589-92q4d\" (UID: \"9d88e05b-2750-483f-a0a3-5169e4cc919c\") " 
pod="openstack/barbican-worker-6d65456589-92q4d" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.843930 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-ovsdbserver-nb\") pod \"dnsmasq-dns-858c959657-bhzfs\" (UID: \"59ac890a-e396-499d-90de-dc8203ebb156\") " pod="openstack/dnsmasq-dns-858c959657-bhzfs" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.843952 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-ovsdbserver-sb\") pod \"dnsmasq-dns-858c959657-bhzfs\" (UID: \"59ac890a-e396-499d-90de-dc8203ebb156\") " pod="openstack/dnsmasq-dns-858c959657-bhzfs" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.843972 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c44e09d-b8ec-401c-886e-c4e2c589778d-config-data\") pod \"cinder-scheduler-0\" (UID: \"0c44e09d-b8ec-401c-886e-c4e2c589778d\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.844054 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d88e05b-2750-483f-a0a3-5169e4cc919c-config-data\") pod \"barbican-worker-6d65456589-92q4d\" (UID: \"9d88e05b-2750-483f-a0a3-5169e4cc919c\") " pod="openstack/barbican-worker-6d65456589-92q4d" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.844073 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qjdw\" (UniqueName: \"kubernetes.io/projected/0c44e09d-b8ec-401c-886e-c4e2c589778d-kube-api-access-5qjdw\") pod \"cinder-scheduler-0\" (UID: \"0c44e09d-b8ec-401c-886e-c4e2c589778d\") " pod="openstack/cinder-scheduler-0" Nov 
24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.844090 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-dns-svc\") pod \"dnsmasq-dns-858c959657-bhzfs\" (UID: \"59ac890a-e396-499d-90de-dc8203ebb156\") " pod="openstack/dnsmasq-dns-858c959657-bhzfs" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.844112 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jswpc\" (UniqueName: \"kubernetes.io/projected/59ac890a-e396-499d-90de-dc8203ebb156-kube-api-access-jswpc\") pod \"dnsmasq-dns-858c959657-bhzfs\" (UID: \"59ac890a-e396-499d-90de-dc8203ebb156\") " pod="openstack/dnsmasq-dns-858c959657-bhzfs" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.844141 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-dns-swift-storage-0\") pod \"dnsmasq-dns-858c959657-bhzfs\" (UID: \"59ac890a-e396-499d-90de-dc8203ebb156\") " pod="openstack/dnsmasq-dns-858c959657-bhzfs" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.844230 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c44e09d-b8ec-401c-886e-c4e2c589778d-scripts\") pod \"cinder-scheduler-0\" (UID: \"0c44e09d-b8ec-401c-886e-c4e2c589778d\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.844289 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6hjf\" (UniqueName: \"kubernetes.io/projected/9d88e05b-2750-483f-a0a3-5169e4cc919c-kube-api-access-p6hjf\") pod \"barbican-worker-6d65456589-92q4d\" (UID: \"9d88e05b-2750-483f-a0a3-5169e4cc919c\") " pod="openstack/barbican-worker-6d65456589-92q4d" Nov 24 
09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.844309 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c44e09d-b8ec-401c-886e-c4e2c589778d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0c44e09d-b8ec-401c-886e-c4e2c589778d\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.851083 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-797bbc649-f7c7l"] Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.852211 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0c44e09d-b8ec-401c-886e-c4e2c589778d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0c44e09d-b8ec-401c-886e-c4e2c589778d\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.852510 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d88e05b-2750-483f-a0a3-5169e4cc919c-logs\") pod \"barbican-worker-6d65456589-92q4d\" (UID: \"9d88e05b-2750-483f-a0a3-5169e4cc919c\") " pod="openstack/barbican-worker-6d65456589-92q4d" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.855803 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d88e05b-2750-483f-a0a3-5169e4cc919c-combined-ca-bundle\") pod \"barbican-worker-6d65456589-92q4d\" (UID: \"9d88e05b-2750-483f-a0a3-5169e4cc919c\") " pod="openstack/barbican-worker-6d65456589-92q4d" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.857418 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c44e09d-b8ec-401c-886e-c4e2c589778d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0c44e09d-b8ec-401c-886e-c4e2c589778d\") " 
pod="openstack/cinder-scheduler-0" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.858050 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c44e09d-b8ec-401c-886e-c4e2c589778d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0c44e09d-b8ec-401c-886e-c4e2c589778d\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.858199 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d88e05b-2750-483f-a0a3-5169e4cc919c-config-data-custom\") pod \"barbican-worker-6d65456589-92q4d\" (UID: \"9d88e05b-2750-483f-a0a3-5169e4cc919c\") " pod="openstack/barbican-worker-6d65456589-92q4d" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.859746 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c44e09d-b8ec-401c-886e-c4e2c589778d-scripts\") pod \"cinder-scheduler-0\" (UID: \"0c44e09d-b8ec-401c-886e-c4e2c589778d\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.863960 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-797bbc649-f7c7l" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.864523 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c44e09d-b8ec-401c-886e-c4e2c589778d-config-data\") pod \"cinder-scheduler-0\" (UID: \"0c44e09d-b8ec-401c-886e-c4e2c589778d\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.868606 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d88e05b-2750-483f-a0a3-5169e4cc919c-config-data\") pod \"barbican-worker-6d65456589-92q4d\" (UID: \"9d88e05b-2750-483f-a0a3-5169e4cc919c\") " pod="openstack/barbican-worker-6d65456589-92q4d" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.869046 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-64cbb46c46-7fsmj" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.878141 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qjdw\" (UniqueName: \"kubernetes.io/projected/0c44e09d-b8ec-401c-886e-c4e2c589778d-kube-api-access-5qjdw\") pod \"cinder-scheduler-0\" (UID: \"0c44e09d-b8ec-401c-886e-c4e2c589778d\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.883325 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6hjf\" (UniqueName: \"kubernetes.io/projected/9d88e05b-2750-483f-a0a3-5169e4cc919c-kube-api-access-p6hjf\") pod \"barbican-worker-6d65456589-92q4d\" (UID: \"9d88e05b-2750-483f-a0a3-5169e4cc919c\") " pod="openstack/barbican-worker-6d65456589-92q4d" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.911109 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.911631 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-797bbc649-f7c7l"] Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.936902 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-787479d464-54bqr"] Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.938442 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-787479d464-54bqr" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.940192 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.955435 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-dns-svc\") pod \"dnsmasq-dns-858c959657-bhzfs\" (UID: \"59ac890a-e396-499d-90de-dc8203ebb156\") " pod="openstack/dnsmasq-dns-858c959657-bhzfs" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.955473 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jswpc\" (UniqueName: \"kubernetes.io/projected/59ac890a-e396-499d-90de-dc8203ebb156-kube-api-access-jswpc\") pod \"dnsmasq-dns-858c959657-bhzfs\" (UID: \"59ac890a-e396-499d-90de-dc8203ebb156\") " pod="openstack/dnsmasq-dns-858c959657-bhzfs" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.955504 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-dns-swift-storage-0\") pod \"dnsmasq-dns-858c959657-bhzfs\" (UID: \"59ac890a-e396-499d-90de-dc8203ebb156\") " pod="openstack/dnsmasq-dns-858c959657-bhzfs" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.955593 4563 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-config\") pod \"dnsmasq-dns-858c959657-bhzfs\" (UID: \"59ac890a-e396-499d-90de-dc8203ebb156\") " pod="openstack/dnsmasq-dns-858c959657-bhzfs" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.955646 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-ovsdbserver-nb\") pod \"dnsmasq-dns-858c959657-bhzfs\" (UID: \"59ac890a-e396-499d-90de-dc8203ebb156\") " pod="openstack/dnsmasq-dns-858c959657-bhzfs" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.955667 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-ovsdbserver-sb\") pod \"dnsmasq-dns-858c959657-bhzfs\" (UID: \"59ac890a-e396-499d-90de-dc8203ebb156\") " pod="openstack/dnsmasq-dns-858c959657-bhzfs" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.956506 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-ovsdbserver-sb\") pod \"dnsmasq-dns-858c959657-bhzfs\" (UID: \"59ac890a-e396-499d-90de-dc8203ebb156\") " pod="openstack/dnsmasq-dns-858c959657-bhzfs" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.957057 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-dns-swift-storage-0\") pod \"dnsmasq-dns-858c959657-bhzfs\" (UID: \"59ac890a-e396-499d-90de-dc8203ebb156\") " pod="openstack/dnsmasq-dns-858c959657-bhzfs" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.962155 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-config\") pod \"dnsmasq-dns-858c959657-bhzfs\" (UID: \"59ac890a-e396-499d-90de-dc8203ebb156\") " pod="openstack/dnsmasq-dns-858c959657-bhzfs" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.972416 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-ovsdbserver-nb\") pod \"dnsmasq-dns-858c959657-bhzfs\" (UID: \"59ac890a-e396-499d-90de-dc8203ebb156\") " pod="openstack/dnsmasq-dns-858c959657-bhzfs" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.980295 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-dns-svc\") pod \"dnsmasq-dns-858c959657-bhzfs\" (UID: \"59ac890a-e396-499d-90de-dc8203ebb156\") " pod="openstack/dnsmasq-dns-858c959657-bhzfs" Nov 24 09:19:11 crc kubenswrapper[4563]: I1124 09:19:11.984026 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jswpc\" (UniqueName: \"kubernetes.io/projected/59ac890a-e396-499d-90de-dc8203ebb156-kube-api-access-jswpc\") pod \"dnsmasq-dns-858c959657-bhzfs\" (UID: \"59ac890a-e396-499d-90de-dc8203ebb156\") " pod="openstack/dnsmasq-dns-858c959657-bhzfs" Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.010940 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-787479d464-54bqr"] Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.041936 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.081346 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxsj7\" (UniqueName: \"kubernetes.io/projected/f99d33d2-61e7-452c-9032-f2be6301ac6d-kube-api-access-kxsj7\") pod 
\"dnsmasq-dns-797bbc649-f7c7l\" (UID: \"f99d33d2-61e7-452c-9032-f2be6301ac6d\") " pod="openstack/dnsmasq-dns-797bbc649-f7c7l" Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.081477 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hks9t\" (UniqueName: \"kubernetes.io/projected/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-kube-api-access-hks9t\") pod \"barbican-api-787479d464-54bqr\" (UID: \"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90\") " pod="openstack/barbican-api-787479d464-54bqr" Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.081522 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-dns-swift-storage-0\") pod \"dnsmasq-dns-797bbc649-f7c7l\" (UID: \"f99d33d2-61e7-452c-9032-f2be6301ac6d\") " pod="openstack/dnsmasq-dns-797bbc649-f7c7l" Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.081578 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-config-data-custom\") pod \"barbican-api-787479d464-54bqr\" (UID: \"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90\") " pod="openstack/barbican-api-787479d464-54bqr" Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.081597 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-dns-svc\") pod \"dnsmasq-dns-797bbc649-f7c7l\" (UID: \"f99d33d2-61e7-452c-9032-f2be6301ac6d\") " pod="openstack/dnsmasq-dns-797bbc649-f7c7l" Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.081631 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-config-data\") pod \"barbican-api-787479d464-54bqr\" (UID: \"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90\") " pod="openstack/barbican-api-787479d464-54bqr" Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.081839 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-config\") pod \"dnsmasq-dns-797bbc649-f7c7l\" (UID: \"f99d33d2-61e7-452c-9032-f2be6301ac6d\") " pod="openstack/dnsmasq-dns-797bbc649-f7c7l" Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.081943 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-combined-ca-bundle\") pod \"barbican-api-787479d464-54bqr\" (UID: \"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90\") " pod="openstack/barbican-api-787479d464-54bqr" Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.081964 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-ovsdbserver-nb\") pod \"dnsmasq-dns-797bbc649-f7c7l\" (UID: \"f99d33d2-61e7-452c-9032-f2be6301ac6d\") " pod="openstack/dnsmasq-dns-797bbc649-f7c7l" Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.082048 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-ovsdbserver-sb\") pod \"dnsmasq-dns-797bbc649-f7c7l\" (UID: \"f99d33d2-61e7-452c-9032-f2be6301ac6d\") " pod="openstack/dnsmasq-dns-797bbc649-f7c7l" Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.082269 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-logs\") pod \"barbican-api-787479d464-54bqr\" (UID: \"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90\") " pod="openstack/barbican-api-787479d464-54bqr" Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.087968 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.092793 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.098626 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.165668 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6d65456589-92q4d" Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.171863 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-748c4bdffd-w974j"] Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.183868 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-748c4bdffd-w974j" Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.185747 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.185766 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-logs\") pod \"barbican-api-787479d464-54bqr\" (UID: \"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90\") " pod="openstack/barbican-api-787479d464-54bqr" Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.185804 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm4hd\" (UniqueName: \"kubernetes.io/projected/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-kube-api-access-qm4hd\") pod \"cinder-api-0\" (UID: \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\") " pod="openstack/cinder-api-0" Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.185829 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxsj7\" (UniqueName: \"kubernetes.io/projected/f99d33d2-61e7-452c-9032-f2be6301ac6d-kube-api-access-kxsj7\") pod \"dnsmasq-dns-797bbc649-f7c7l\" (UID: \"f99d33d2-61e7-452c-9032-f2be6301ac6d\") " pod="openstack/dnsmasq-dns-797bbc649-f7c7l" Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.185854 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-config-data\") pod \"cinder-api-0\" (UID: \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\") " pod="openstack/cinder-api-0" Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.185872 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-scripts\") pod \"cinder-api-0\" (UID: \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\") " pod="openstack/cinder-api-0" Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.185894 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hks9t\" (UniqueName: \"kubernetes.io/projected/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-kube-api-access-hks9t\") pod \"barbican-api-787479d464-54bqr\" (UID: \"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90\") " pod="openstack/barbican-api-787479d464-54bqr" Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.185918 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-dns-swift-storage-0\") pod \"dnsmasq-dns-797bbc649-f7c7l\" (UID: \"f99d33d2-61e7-452c-9032-f2be6301ac6d\") " pod="openstack/dnsmasq-dns-797bbc649-f7c7l" Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.185943 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-config-data-custom\") pod \"barbican-api-787479d464-54bqr\" (UID: \"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90\") " pod="openstack/barbican-api-787479d464-54bqr" Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.185958 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-dns-svc\") pod \"dnsmasq-dns-797bbc649-f7c7l\" (UID: \"f99d33d2-61e7-452c-9032-f2be6301ac6d\") " pod="openstack/dnsmasq-dns-797bbc649-f7c7l" Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.185997 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-config-data\") pod 
\"barbican-api-787479d464-54bqr\" (UID: \"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90\") " pod="openstack/barbican-api-787479d464-54bqr"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.186040 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\") " pod="openstack/cinder-api-0"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.186057 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\") " pod="openstack/cinder-api-0"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.186079 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-config-data-custom\") pod \"cinder-api-0\" (UID: \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\") " pod="openstack/cinder-api-0"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.186112 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-config\") pod \"dnsmasq-dns-797bbc649-f7c7l\" (UID: \"f99d33d2-61e7-452c-9032-f2be6301ac6d\") " pod="openstack/dnsmasq-dns-797bbc649-f7c7l"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.186128 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-logs\") pod \"cinder-api-0\" (UID: \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\") " pod="openstack/cinder-api-0"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.186147 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-combined-ca-bundle\") pod \"barbican-api-787479d464-54bqr\" (UID: \"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90\") " pod="openstack/barbican-api-787479d464-54bqr"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.186169 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-ovsdbserver-nb\") pod \"dnsmasq-dns-797bbc649-f7c7l\" (UID: \"f99d33d2-61e7-452c-9032-f2be6301ac6d\") " pod="openstack/dnsmasq-dns-797bbc649-f7c7l"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.186201 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-ovsdbserver-sb\") pod \"dnsmasq-dns-797bbc649-f7c7l\" (UID: \"f99d33d2-61e7-452c-9032-f2be6301ac6d\") " pod="openstack/dnsmasq-dns-797bbc649-f7c7l"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.186456 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-logs\") pod \"barbican-api-787479d464-54bqr\" (UID: \"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90\") " pod="openstack/barbican-api-787479d464-54bqr"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.187968 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.188534 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.188813 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.189001 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-8g5hg"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.189394 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-dns-swift-storage-0\") pod \"dnsmasq-dns-797bbc649-f7c7l\" (UID: \"f99d33d2-61e7-452c-9032-f2be6301ac6d\") " pod="openstack/dnsmasq-dns-797bbc649-f7c7l"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.192294 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-dns-svc\") pod \"dnsmasq-dns-797bbc649-f7c7l\" (UID: \"f99d33d2-61e7-452c-9032-f2be6301ac6d\") " pod="openstack/dnsmasq-dns-797bbc649-f7c7l"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.193073 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-ovsdbserver-nb\") pod \"dnsmasq-dns-797bbc649-f7c7l\" (UID: \"f99d33d2-61e7-452c-9032-f2be6301ac6d\") " pod="openstack/dnsmasq-dns-797bbc649-f7c7l"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.193745 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-748c4bdffd-w974j"]
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.194876 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-ovsdbserver-sb\") pod \"dnsmasq-dns-797bbc649-f7c7l\" (UID: \"f99d33d2-61e7-452c-9032-f2be6301ac6d\") " pod="openstack/dnsmasq-dns-797bbc649-f7c7l"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.199770 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-config\") pod \"dnsmasq-dns-797bbc649-f7c7l\" (UID: \"f99d33d2-61e7-452c-9032-f2be6301ac6d\") " pod="openstack/dnsmasq-dns-797bbc649-f7c7l"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.202626 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-config-data-custom\") pod \"barbican-api-787479d464-54bqr\" (UID: \"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90\") " pod="openstack/barbican-api-787479d464-54bqr"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.225520 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-combined-ca-bundle\") pod \"barbican-api-787479d464-54bqr\" (UID: \"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90\") " pod="openstack/barbican-api-787479d464-54bqr"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.226525 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-config-data\") pod \"barbican-api-787479d464-54bqr\" (UID: \"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90\") " pod="openstack/barbican-api-787479d464-54bqr"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.237820 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hks9t\" (UniqueName: \"kubernetes.io/projected/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-kube-api-access-hks9t\") pod \"barbican-api-787479d464-54bqr\" (UID: \"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90\") " pod="openstack/barbican-api-787479d464-54bqr"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.240203 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxsj7\" (UniqueName: \"kubernetes.io/projected/f99d33d2-61e7-452c-9032-f2be6301ac6d-kube-api-access-kxsj7\") pod \"dnsmasq-dns-797bbc649-f7c7l\" (UID: \"f99d33d2-61e7-452c-9032-f2be6301ac6d\") " pod="openstack/dnsmasq-dns-797bbc649-f7c7l"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.286947 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9103bb32-e426-4c4b-ade8-d3430cf5ca11-scripts\") pod \"placement-748c4bdffd-w974j\" (UID: \"9103bb32-e426-4c4b-ade8-d3430cf5ca11\") " pod="openstack/placement-748c4bdffd-w974j"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.287193 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9103bb32-e426-4c4b-ade8-d3430cf5ca11-combined-ca-bundle\") pod \"placement-748c4bdffd-w974j\" (UID: \"9103bb32-e426-4c4b-ade8-d3430cf5ca11\") " pod="openstack/placement-748c4bdffd-w974j"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.287224 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9103bb32-e426-4c4b-ade8-d3430cf5ca11-logs\") pod \"placement-748c4bdffd-w974j\" (UID: \"9103bb32-e426-4c4b-ade8-d3430cf5ca11\") " pod="openstack/placement-748c4bdffd-w974j"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.287239 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9103bb32-e426-4c4b-ade8-d3430cf5ca11-config-data\") pod \"placement-748c4bdffd-w974j\" (UID: \"9103bb32-e426-4c4b-ade8-d3430cf5ca11\") " pod="openstack/placement-748c4bdffd-w974j"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.287282 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\") " pod="openstack/cinder-api-0"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.287297 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\") " pod="openstack/cinder-api-0"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.287317 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-config-data-custom\") pod \"cinder-api-0\" (UID: \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\") " pod="openstack/cinder-api-0"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.287334 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9103bb32-e426-4c4b-ade8-d3430cf5ca11-internal-tls-certs\") pod \"placement-748c4bdffd-w974j\" (UID: \"9103bb32-e426-4c4b-ade8-d3430cf5ca11\") " pod="openstack/placement-748c4bdffd-w974j"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.287356 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9103bb32-e426-4c4b-ade8-d3430cf5ca11-public-tls-certs\") pod \"placement-748c4bdffd-w974j\" (UID: \"9103bb32-e426-4c4b-ade8-d3430cf5ca11\") " pod="openstack/placement-748c4bdffd-w974j"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.287388 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-logs\") pod \"cinder-api-0\" (UID: \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\") " pod="openstack/cinder-api-0"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.287431 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gf9g\" (UniqueName: \"kubernetes.io/projected/9103bb32-e426-4c4b-ade8-d3430cf5ca11-kube-api-access-7gf9g\") pod \"placement-748c4bdffd-w974j\" (UID: \"9103bb32-e426-4c4b-ade8-d3430cf5ca11\") " pod="openstack/placement-748c4bdffd-w974j"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.287452 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm4hd\" (UniqueName: \"kubernetes.io/projected/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-kube-api-access-qm4hd\") pod \"cinder-api-0\" (UID: \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\") " pod="openstack/cinder-api-0"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.287477 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-config-data\") pod \"cinder-api-0\" (UID: \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\") " pod="openstack/cinder-api-0"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.287500 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-scripts\") pod \"cinder-api-0\" (UID: \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\") " pod="openstack/cinder-api-0"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.292077 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-logs\") pod \"cinder-api-0\" (UID: \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\") " pod="openstack/cinder-api-0"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.293746 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\") " pod="openstack/cinder-api-0"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.295556 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-scripts\") pod \"cinder-api-0\" (UID: \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\") " pod="openstack/cinder-api-0"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.296579 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-config-data\") pod \"cinder-api-0\" (UID: \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\") " pod="openstack/cinder-api-0"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.306323 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\") " pod="openstack/cinder-api-0"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.307992 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm4hd\" (UniqueName: \"kubernetes.io/projected/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-kube-api-access-qm4hd\") pod \"cinder-api-0\" (UID: \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\") " pod="openstack/cinder-api-0"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.320944 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-config-data-custom\") pod \"cinder-api-0\" (UID: \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\") " pod="openstack/cinder-api-0"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.321620 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-787479d464-54bqr"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.358474 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-858c959657-bhzfs"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.359452 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b995b5b3-41b0-4334-9f7c-792a50e780e7" containerName="ceilometer-notification-agent" containerID="cri-o://714c30487c4e7dfc7a7cf3f6168922dc32da768f7e7dbe76847800c55047ae48" gracePeriod=30
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.359683 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b995b5b3-41b0-4334-9f7c-792a50e780e7","Type":"ContainerStarted","Data":"3415a060233e9b99461d2d14ff354bed25e3e9025c2ceeeda7ae7692d24374e4"}
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.359911 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.359951 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b995b5b3-41b0-4334-9f7c-792a50e780e7" containerName="proxy-httpd" containerID="cri-o://3415a060233e9b99461d2d14ff354bed25e3e9025c2ceeeda7ae7692d24374e4" gracePeriod=30
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.359999 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b995b5b3-41b0-4334-9f7c-792a50e780e7" containerName="sg-core" containerID="cri-o://3991af0b0f902f2e913b758214f68941b05dd8ebc1d3590242e7b910416be4c7" gracePeriod=30
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.388690 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9103bb32-e426-4c4b-ade8-d3430cf5ca11-internal-tls-certs\") pod \"placement-748c4bdffd-w974j\" (UID: \"9103bb32-e426-4c4b-ade8-d3430cf5ca11\") " pod="openstack/placement-748c4bdffd-w974j"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.388743 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9103bb32-e426-4c4b-ade8-d3430cf5ca11-public-tls-certs\") pod \"placement-748c4bdffd-w974j\" (UID: \"9103bb32-e426-4c4b-ade8-d3430cf5ca11\") " pod="openstack/placement-748c4bdffd-w974j"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.388793 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gf9g\" (UniqueName: \"kubernetes.io/projected/9103bb32-e426-4c4b-ade8-d3430cf5ca11-kube-api-access-7gf9g\") pod \"placement-748c4bdffd-w974j\" (UID: \"9103bb32-e426-4c4b-ade8-d3430cf5ca11\") " pod="openstack/placement-748c4bdffd-w974j"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.388837 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9103bb32-e426-4c4b-ade8-d3430cf5ca11-scripts\") pod \"placement-748c4bdffd-w974j\" (UID: \"9103bb32-e426-4c4b-ade8-d3430cf5ca11\") " pod="openstack/placement-748c4bdffd-w974j"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.388855 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9103bb32-e426-4c4b-ade8-d3430cf5ca11-combined-ca-bundle\") pod \"placement-748c4bdffd-w974j\" (UID: \"9103bb32-e426-4c4b-ade8-d3430cf5ca11\") " pod="openstack/placement-748c4bdffd-w974j"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.388881 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9103bb32-e426-4c4b-ade8-d3430cf5ca11-logs\") pod \"placement-748c4bdffd-w974j\" (UID: \"9103bb32-e426-4c4b-ade8-d3430cf5ca11\") " pod="openstack/placement-748c4bdffd-w974j"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.388897 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9103bb32-e426-4c4b-ade8-d3430cf5ca11-config-data\") pod \"placement-748c4bdffd-w974j\" (UID: \"9103bb32-e426-4c4b-ade8-d3430cf5ca11\") " pod="openstack/placement-748c4bdffd-w974j"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.389578 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-858c959657-bhzfs"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.391298 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9103bb32-e426-4c4b-ade8-d3430cf5ca11-logs\") pod \"placement-748c4bdffd-w974j\" (UID: \"9103bb32-e426-4c4b-ade8-d3430cf5ca11\") " pod="openstack/placement-748c4bdffd-w974j"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.393572 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9103bb32-e426-4c4b-ade8-d3430cf5ca11-config-data\") pod \"placement-748c4bdffd-w974j\" (UID: \"9103bb32-e426-4c4b-ade8-d3430cf5ca11\") " pod="openstack/placement-748c4bdffd-w974j"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.393848 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9103bb32-e426-4c4b-ade8-d3430cf5ca11-internal-tls-certs\") pod \"placement-748c4bdffd-w974j\" (UID: \"9103bb32-e426-4c4b-ade8-d3430cf5ca11\") " pod="openstack/placement-748c4bdffd-w974j"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.394008 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9103bb32-e426-4c4b-ade8-d3430cf5ca11-public-tls-certs\") pod \"placement-748c4bdffd-w974j\" (UID: \"9103bb32-e426-4c4b-ade8-d3430cf5ca11\") " pod="openstack/placement-748c4bdffd-w974j"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.394939 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9103bb32-e426-4c4b-ade8-d3430cf5ca11-combined-ca-bundle\") pod \"placement-748c4bdffd-w974j\" (UID: \"9103bb32-e426-4c4b-ade8-d3430cf5ca11\") " pod="openstack/placement-748c4bdffd-w974j"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.400092 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9103bb32-e426-4c4b-ade8-d3430cf5ca11-scripts\") pod \"placement-748c4bdffd-w974j\" (UID: \"9103bb32-e426-4c4b-ade8-d3430cf5ca11\") " pod="openstack/placement-748c4bdffd-w974j"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.403232 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gf9g\" (UniqueName: \"kubernetes.io/projected/9103bb32-e426-4c4b-ade8-d3430cf5ca11-kube-api-access-7gf9g\") pod \"placement-748c4bdffd-w974j\" (UID: \"9103bb32-e426-4c4b-ade8-d3430cf5ca11\") " pod="openstack/placement-748c4bdffd-w974j"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.469278 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.482191 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.490620 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-dns-swift-storage-0\") pod \"59ac890a-e396-499d-90de-dc8203ebb156\" (UID: \"59ac890a-e396-499d-90de-dc8203ebb156\") "
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.490716 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-ovsdbserver-nb\") pod \"59ac890a-e396-499d-90de-dc8203ebb156\" (UID: \"59ac890a-e396-499d-90de-dc8203ebb156\") "
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.490797 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-config\") pod \"59ac890a-e396-499d-90de-dc8203ebb156\" (UID: \"59ac890a-e396-499d-90de-dc8203ebb156\") "
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.490882 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-dns-svc\") pod \"59ac890a-e396-499d-90de-dc8203ebb156\" (UID: \"59ac890a-e396-499d-90de-dc8203ebb156\") "
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.490924 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jswpc\" (UniqueName: \"kubernetes.io/projected/59ac890a-e396-499d-90de-dc8203ebb156-kube-api-access-jswpc\") pod \"59ac890a-e396-499d-90de-dc8203ebb156\" (UID: \"59ac890a-e396-499d-90de-dc8203ebb156\") "
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.490947 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-ovsdbserver-sb\") pod \"59ac890a-e396-499d-90de-dc8203ebb156\" (UID: \"59ac890a-e396-499d-90de-dc8203ebb156\") "
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.491094 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "59ac890a-e396-499d-90de-dc8203ebb156" (UID: "59ac890a-e396-499d-90de-dc8203ebb156"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.491446 4563 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.491472 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-config" (OuterVolumeSpecName: "config") pod "59ac890a-e396-499d-90de-dc8203ebb156" (UID: "59ac890a-e396-499d-90de-dc8203ebb156"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.491671 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "59ac890a-e396-499d-90de-dc8203ebb156" (UID: "59ac890a-e396-499d-90de-dc8203ebb156"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.491803 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "59ac890a-e396-499d-90de-dc8203ebb156" (UID: "59ac890a-e396-499d-90de-dc8203ebb156"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.492478 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "59ac890a-e396-499d-90de-dc8203ebb156" (UID: "59ac890a-e396-499d-90de-dc8203ebb156"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.496849 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-797bbc649-f7c7l"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.497601 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59ac890a-e396-499d-90de-dc8203ebb156-kube-api-access-jswpc" (OuterVolumeSpecName: "kube-api-access-jswpc") pod "59ac890a-e396-499d-90de-dc8203ebb156" (UID: "59ac890a-e396-499d-90de-dc8203ebb156"). InnerVolumeSpecName "kube-api-access-jswpc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 09:19:12 crc kubenswrapper[4563]: W1124 09:19:12.518966 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c44e09d_b8ec_401c_886e_c4e2c589778d.slice/crio-58166f2474d31f06367288962ee11279db40d91c22965a289eb50ef20b7bead4 WatchSource:0}: Error finding container 58166f2474d31f06367288962ee11279db40d91c22965a289eb50ef20b7bead4: Status 404 returned error can't find the container with id 58166f2474d31f06367288962ee11279db40d91c22965a289eb50ef20b7bead4
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.592868 4563 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.592897 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-config\") on node \"crc\" DevicePath \"\""
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.592907 4563 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.592917 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jswpc\" (UniqueName: \"kubernetes.io/projected/59ac890a-e396-499d-90de-dc8203ebb156-kube-api-access-jswpc\") on node \"crc\" DevicePath \"\""
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.592927 4563 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59ac890a-e396-499d-90de-dc8203ebb156-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.638075 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-64cbb46c46-7fsmj"]
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.685904 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-748c4bdffd-w974j"
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.727684 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6d65456589-92q4d"]
Nov 24 09:19:12 crc kubenswrapper[4563]: W1124 09:19:12.735431 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d88e05b_2750_483f_a0a3_5169e4cc919c.slice/crio-2308e22b9b49d65aadff4f824fcffa3743c44eb10596a71235b07c756fc3857c WatchSource:0}: Error finding container 2308e22b9b49d65aadff4f824fcffa3743c44eb10596a71235b07c756fc3857c: Status 404 returned error can't find the container with id 2308e22b9b49d65aadff4f824fcffa3743c44eb10596a71235b07c756fc3857c
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.813334 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-787479d464-54bqr"]
Nov 24 09:19:12 crc kubenswrapper[4563]: W1124 09:19:12.818965 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod626cf8d9_6c74_4bd2_a62f_206bd9ed3c90.slice/crio-20bc5a5a4959c7d953ca940a0f63dd0b9dfb752b785a5ea47654e28d9e0404ff WatchSource:0}: Error finding container 20bc5a5a4959c7d953ca940a0f63dd0b9dfb752b785a5ea47654e28d9e0404ff: Status 404 returned error can't find the container with id 20bc5a5a4959c7d953ca940a0f63dd0b9dfb752b785a5ea47654e28d9e0404ff
Nov 24 09:19:12 crc kubenswrapper[4563]: I1124 09:19:12.933011 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Nov 24 09:19:12 crc kubenswrapper[4563]: W1124 09:19:12.935192 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a0215ab_b564_47d7_a2fc_f32bd768fa8e.slice/crio-92c191215d90c713706d38f5c1f77f64db767073608344f830e1971070165962 WatchSource:0}: Error finding container 92c191215d90c713706d38f5c1f77f64db767073608344f830e1971070165962: Status 404 returned error can't find the container with id 92c191215d90c713706d38f5c1f77f64db767073608344f830e1971070165962
Nov 24 09:19:13 crc kubenswrapper[4563]: I1124 09:19:13.004051 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-797bbc649-f7c7l"]
Nov 24 09:19:13 crc kubenswrapper[4563]: W1124 09:19:13.011092 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf99d33d2_61e7_452c_9032_f2be6301ac6d.slice/crio-a4f76e64693781f3918d25a3c57fa34985b82491d7e9fc266dd5fbd41d4859d6 WatchSource:0}: Error finding container a4f76e64693781f3918d25a3c57fa34985b82491d7e9fc266dd5fbd41d4859d6: Status 404 returned error can't find the container with id a4f76e64693781f3918d25a3c57fa34985b82491d7e9fc266dd5fbd41d4859d6
Nov 24 09:19:13 crc kubenswrapper[4563]: I1124 09:19:13.135070 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-748c4bdffd-w974j"]
Nov 24 09:19:13 crc kubenswrapper[4563]: E1124 09:19:13.307343 4563 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59ac890a_e396_499d_90de_dc8203ebb156.slice\": RecentStats: unable to find data in memory cache]"
Nov 24 09:19:13 crc kubenswrapper[4563]: I1124 09:19:13.382079 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-748c4bdffd-w974j" event={"ID":"9103bb32-e426-4c4b-ade8-d3430cf5ca11","Type":"ContainerStarted","Data":"044cdc60881790c7a0f572ebd3dd23468c2700ec2e8bd3f7e7ee444e54ee198f"}
Nov 24 09:19:13 crc kubenswrapper[4563]: I1124 09:19:13.388681 4563 generic.go:334] "Generic (PLEG): container finished" podID="b995b5b3-41b0-4334-9f7c-792a50e780e7" containerID="3415a060233e9b99461d2d14ff354bed25e3e9025c2ceeeda7ae7692d24374e4" exitCode=0
Nov 24 09:19:13 crc kubenswrapper[4563]: I1124 09:19:13.388709 4563 generic.go:334] "Generic (PLEG): container finished" podID="b995b5b3-41b0-4334-9f7c-792a50e780e7" containerID="3991af0b0f902f2e913b758214f68941b05dd8ebc1d3590242e7b910416be4c7" exitCode=2
Nov 24 09:19:13 crc kubenswrapper[4563]: I1124 09:19:13.389003 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b995b5b3-41b0-4334-9f7c-792a50e780e7","Type":"ContainerDied","Data":"3415a060233e9b99461d2d14ff354bed25e3e9025c2ceeeda7ae7692d24374e4"}
Nov 24 09:19:13 crc kubenswrapper[4563]: I1124 09:19:13.389037 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b995b5b3-41b0-4334-9f7c-792a50e780e7","Type":"ContainerDied","Data":"3991af0b0f902f2e913b758214f68941b05dd8ebc1d3590242e7b910416be4c7"}
Nov 24 09:19:13 crc kubenswrapper[4563]: I1124 09:19:13.393209 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-787479d464-54bqr" event={"ID":"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90","Type":"ContainerStarted","Data":"8f017481fbaa8c5cb10be0c34acda2f08ea259d424d7ae7f802c7c4a2f91b14d"}
Nov 24 09:19:13 crc kubenswrapper[4563]: I1124 09:19:13.393244 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-787479d464-54bqr" event={"ID":"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90","Type":"ContainerStarted","Data":"a0cf758c57b6cc78147c5f06280ea898ccd750ead778e95095d7f8d2a4da1b9b"}
Nov 24 09:19:13 crc kubenswrapper[4563]: I1124 09:19:13.393253 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-787479d464-54bqr" event={"ID":"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90","Type":"ContainerStarted","Data":"20bc5a5a4959c7d953ca940a0f63dd0b9dfb752b785a5ea47654e28d9e0404ff"}
Nov 24 09:19:13
crc kubenswrapper[4563]: I1124 09:19:13.393705 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-787479d464-54bqr" Nov 24 09:19:13 crc kubenswrapper[4563]: I1124 09:19:13.393759 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-787479d464-54bqr" Nov 24 09:19:13 crc kubenswrapper[4563]: I1124 09:19:13.396523 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6a0215ab-b564-47d7-a2fc-f32bd768fa8e","Type":"ContainerStarted","Data":"92c191215d90c713706d38f5c1f77f64db767073608344f830e1971070165962"} Nov 24 09:19:13 crc kubenswrapper[4563]: I1124 09:19:13.398186 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-797bbc649-f7c7l" event={"ID":"f99d33d2-61e7-452c-9032-f2be6301ac6d","Type":"ContainerStarted","Data":"a4f76e64693781f3918d25a3c57fa34985b82491d7e9fc266dd5fbd41d4859d6"} Nov 24 09:19:13 crc kubenswrapper[4563]: I1124 09:19:13.401313 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64cbb46c46-7fsmj" event={"ID":"17d8ec67-c825-4ab0-bd77-cd610ff6838e","Type":"ContainerStarted","Data":"20166ad1e63460948ab1e3380264be35f734841cfd75a607a31db33122e53d77"} Nov 24 09:19:13 crc kubenswrapper[4563]: I1124 09:19:13.403149 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d65456589-92q4d" event={"ID":"9d88e05b-2750-483f-a0a3-5169e4cc919c","Type":"ContainerStarted","Data":"2308e22b9b49d65aadff4f824fcffa3743c44eb10596a71235b07c756fc3857c"} Nov 24 09:19:13 crc kubenswrapper[4563]: I1124 09:19:13.406420 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-858c959657-bhzfs" Nov 24 09:19:13 crc kubenswrapper[4563]: I1124 09:19:13.406653 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0c44e09d-b8ec-401c-886e-c4e2c589778d","Type":"ContainerStarted","Data":"58166f2474d31f06367288962ee11279db40d91c22965a289eb50ef20b7bead4"} Nov 24 09:19:13 crc kubenswrapper[4563]: I1124 09:19:13.415942 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-787479d464-54bqr" podStartSLOduration=2.415925381 podStartE2EDuration="2.415925381s" podCreationTimestamp="2025-11-24 09:19:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:19:13.409430321 +0000 UTC m=+930.668407768" watchObservedRunningTime="2025-11-24 09:19:13.415925381 +0000 UTC m=+930.674902828" Nov 24 09:19:13 crc kubenswrapper[4563]: I1124 09:19:13.451824 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-858c959657-bhzfs"] Nov 24 09:19:13 crc kubenswrapper[4563]: I1124 09:19:13.462533 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-858c959657-bhzfs"] Nov 24 09:19:13 crc kubenswrapper[4563]: I1124 09:19:13.794450 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 24 09:19:14 crc kubenswrapper[4563]: I1124 09:19:14.420086 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6a0215ab-b564-47d7-a2fc-f32bd768fa8e","Type":"ContainerStarted","Data":"288d0ff70837f8b9427b993cd6183d6369dfc0c85994139227891113a194998c"} Nov 24 09:19:14 crc kubenswrapper[4563]: I1124 09:19:14.422008 4563 generic.go:334] "Generic (PLEG): container finished" podID="f99d33d2-61e7-452c-9032-f2be6301ac6d" containerID="15ccc2076e8db6b3ce09ce2b1dc14c3d41f18f90f1b46bd31571188c80a2901c" exitCode=0 Nov 24 09:19:14 crc 
kubenswrapper[4563]: I1124 09:19:14.422078 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-797bbc649-f7c7l" event={"ID":"f99d33d2-61e7-452c-9032-f2be6301ac6d","Type":"ContainerDied","Data":"15ccc2076e8db6b3ce09ce2b1dc14c3d41f18f90f1b46bd31571188c80a2901c"} Nov 24 09:19:14 crc kubenswrapper[4563]: I1124 09:19:14.427617 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-748c4bdffd-w974j" event={"ID":"9103bb32-e426-4c4b-ade8-d3430cf5ca11","Type":"ContainerStarted","Data":"0d25152327993dc5523928e9aa1f2b6a235b00aa99e5c0469e412de25ff7842e"} Nov 24 09:19:14 crc kubenswrapper[4563]: I1124 09:19:14.427681 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-748c4bdffd-w974j" event={"ID":"9103bb32-e426-4c4b-ade8-d3430cf5ca11","Type":"ContainerStarted","Data":"fdfd4cf18e741880864c5e9b39f68cc3ed1c4ac4f23313cabb78232b2e96d3d4"} Nov 24 09:19:14 crc kubenswrapper[4563]: I1124 09:19:14.427700 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-748c4bdffd-w974j" Nov 24 09:19:14 crc kubenswrapper[4563]: I1124 09:19:14.427866 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-748c4bdffd-w974j" Nov 24 09:19:14 crc kubenswrapper[4563]: I1124 09:19:14.464051 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-748c4bdffd-w974j" podStartSLOduration=2.464034599 podStartE2EDuration="2.464034599s" podCreationTimestamp="2025-11-24 09:19:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:19:14.457406378 +0000 UTC m=+931.716383826" watchObservedRunningTime="2025-11-24 09:19:14.464034599 +0000 UTC m=+931.723012046" Nov 24 09:19:15 crc kubenswrapper[4563]: I1124 09:19:15.064449 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="59ac890a-e396-499d-90de-dc8203ebb156" path="/var/lib/kubelet/pods/59ac890a-e396-499d-90de-dc8203ebb156/volumes" Nov 24 09:19:15 crc kubenswrapper[4563]: I1124 09:19:15.445355 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0c44e09d-b8ec-401c-886e-c4e2c589778d","Type":"ContainerStarted","Data":"85d9e9edf50dbd2445fbe782603b12460572db0d385d8f922ad67f5bdd1e8548"} Nov 24 09:19:16 crc kubenswrapper[4563]: I1124 09:19:16.174434 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r9lpl" Nov 24 09:19:16 crc kubenswrapper[4563]: I1124 09:19:16.233780 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r9lpl" Nov 24 09:19:16 crc kubenswrapper[4563]: I1124 09:19:16.290608 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r9lpl"] Nov 24 09:19:16 crc kubenswrapper[4563]: I1124 09:19:16.315798 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:19:16 crc kubenswrapper[4563]: I1124 09:19:16.408705 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r6r55"] Nov 24 09:19:16 crc kubenswrapper[4563]: I1124 09:19:16.408928 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r6r55" podUID="2704a13f-1433-4804-8818-e433c50beff1" containerName="registry-server" containerID="cri-o://92d72b407fd77954e41bd4d805f09771f3a11466e8cc0d004866b72dfb651b10" gracePeriod=2 Nov 24 09:19:16 crc kubenswrapper[4563]: I1124 09:19:16.473404 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d65456589-92q4d" 
event={"ID":"9d88e05b-2750-483f-a0a3-5169e4cc919c","Type":"ContainerStarted","Data":"33bd474a31e2fc0c9d214ddfd4a8f4c2cf1435a90f8569cad267a97b92e1ba7c"} Nov 24 09:19:16 crc kubenswrapper[4563]: I1124 09:19:16.473473 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d65456589-92q4d" event={"ID":"9d88e05b-2750-483f-a0a3-5169e4cc919c","Type":"ContainerStarted","Data":"d7450b01897f3c867ffddaecbcbc44bd340860a0027b2315721b3bf06ca65dc0"} Nov 24 09:19:16 crc kubenswrapper[4563]: I1124 09:19:16.485397 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0c44e09d-b8ec-401c-886e-c4e2c589778d","Type":"ContainerStarted","Data":"28c6f9f3742b86022ae1f8f63ea3f21410a0a3fde2b639019fdfba51ced1f206"} Nov 24 09:19:16 crc kubenswrapper[4563]: I1124 09:19:16.489917 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6a0215ab-b564-47d7-a2fc-f32bd768fa8e","Type":"ContainerStarted","Data":"a0dbfdd103d6294db5d7f20efefc16935541f1f86a83b94d0bb2a31d2fd00d63"} Nov 24 09:19:16 crc kubenswrapper[4563]: I1124 09:19:16.490034 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6a0215ab-b564-47d7-a2fc-f32bd768fa8e" containerName="cinder-api-log" containerID="cri-o://288d0ff70837f8b9427b993cd6183d6369dfc0c85994139227891113a194998c" gracePeriod=30 Nov 24 09:19:16 crc kubenswrapper[4563]: I1124 09:19:16.490118 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 24 09:19:16 crc kubenswrapper[4563]: I1124 09:19:16.490152 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6a0215ab-b564-47d7-a2fc-f32bd768fa8e" containerName="cinder-api" containerID="cri-o://a0dbfdd103d6294db5d7f20efefc16935541f1f86a83b94d0bb2a31d2fd00d63" gracePeriod=30 Nov 24 09:19:16 crc kubenswrapper[4563]: I1124 09:19:16.508148 4563 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6d65456589-92q4d" podStartSLOduration=2.979793468 podStartE2EDuration="5.508133889s" podCreationTimestamp="2025-11-24 09:19:11 +0000 UTC" firstStartedPulling="2025-11-24 09:19:12.739502624 +0000 UTC m=+929.998480072" lastFinishedPulling="2025-11-24 09:19:15.267843045 +0000 UTC m=+932.526820493" observedRunningTime="2025-11-24 09:19:16.488927167 +0000 UTC m=+933.747904613" watchObservedRunningTime="2025-11-24 09:19:16.508133889 +0000 UTC m=+933.767111337" Nov 24 09:19:16 crc kubenswrapper[4563]: I1124 09:19:16.512702 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-797bbc649-f7c7l" event={"ID":"f99d33d2-61e7-452c-9032-f2be6301ac6d","Type":"ContainerStarted","Data":"cb61e886c7e087b2b2709b91942be4dc5c8db8b656ed0610f97ff184e29ff520"} Nov 24 09:19:16 crc kubenswrapper[4563]: I1124 09:19:16.512777 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-797bbc649-f7c7l" Nov 24 09:19:16 crc kubenswrapper[4563]: I1124 09:19:16.515353 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.552303158 podStartE2EDuration="5.515342254s" podCreationTimestamp="2025-11-24 09:19:11 +0000 UTC" firstStartedPulling="2025-11-24 09:19:12.531844611 +0000 UTC m=+929.790822057" lastFinishedPulling="2025-11-24 09:19:13.494883707 +0000 UTC m=+930.753861153" observedRunningTime="2025-11-24 09:19:16.506700176 +0000 UTC m=+933.765677624" watchObservedRunningTime="2025-11-24 09:19:16.515342254 +0000 UTC m=+933.774319702" Nov 24 09:19:16 crc kubenswrapper[4563]: I1124 09:19:16.522993 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64cbb46c46-7fsmj" event={"ID":"17d8ec67-c825-4ab0-bd77-cd610ff6838e","Type":"ContainerStarted","Data":"f2c7d5c10c72d62c78ff1547aebab3e5c68d3a21c4eddb7afe6a6730ad103da3"} Nov 24 
09:19:16 crc kubenswrapper[4563]: I1124 09:19:16.523027 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-64cbb46c46-7fsmj" event={"ID":"17d8ec67-c825-4ab0-bd77-cd610ff6838e","Type":"ContainerStarted","Data":"a9d1eb174304bc90acc064d9106e83dced0df78bd5549d62f7c137e98a89819a"} Nov 24 09:19:16 crc kubenswrapper[4563]: I1124 09:19:16.569026 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-64cbb46c46-7fsmj" podStartSLOduration=2.946460783 podStartE2EDuration="5.569007247s" podCreationTimestamp="2025-11-24 09:19:11 +0000 UTC" firstStartedPulling="2025-11-24 09:19:12.644356078 +0000 UTC m=+929.903333525" lastFinishedPulling="2025-11-24 09:19:15.266902542 +0000 UTC m=+932.525879989" observedRunningTime="2025-11-24 09:19:16.56205275 +0000 UTC m=+933.821030198" watchObservedRunningTime="2025-11-24 09:19:16.569007247 +0000 UTC m=+933.827984694" Nov 24 09:19:16 crc kubenswrapper[4563]: I1124 09:19:16.569135 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.5691302799999995 podStartE2EDuration="5.56913028s" podCreationTimestamp="2025-11-24 09:19:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:19:16.535551149 +0000 UTC m=+933.794528586" watchObservedRunningTime="2025-11-24 09:19:16.56913028 +0000 UTC m=+933.828107727" Nov 24 09:19:16 crc kubenswrapper[4563]: I1124 09:19:16.605683 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-797bbc649-f7c7l" podStartSLOduration=5.605664023 podStartE2EDuration="5.605664023s" podCreationTimestamp="2025-11-24 09:19:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:19:16.580332287 +0000 UTC m=+933.839309734" 
watchObservedRunningTime="2025-11-24 09:19:16.605664023 +0000 UTC m=+933.864641470" Nov 24 09:19:16 crc kubenswrapper[4563]: I1124 09:19:16.627197 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6d999bbd6-cqj6s" Nov 24 09:19:16 crc kubenswrapper[4563]: I1124 09:19:16.915745 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 24 09:19:16 crc kubenswrapper[4563]: I1124 09:19:16.928795 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r6r55" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.010029 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2704a13f-1433-4804-8818-e433c50beff1-catalog-content\") pod \"2704a13f-1433-4804-8818-e433c50beff1\" (UID: \"2704a13f-1433-4804-8818-e433c50beff1\") " Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.010186 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgpr8\" (UniqueName: \"kubernetes.io/projected/2704a13f-1433-4804-8818-e433c50beff1-kube-api-access-pgpr8\") pod \"2704a13f-1433-4804-8818-e433c50beff1\" (UID: \"2704a13f-1433-4804-8818-e433c50beff1\") " Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.010223 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2704a13f-1433-4804-8818-e433c50beff1-utilities\") pod \"2704a13f-1433-4804-8818-e433c50beff1\" (UID: \"2704a13f-1433-4804-8818-e433c50beff1\") " Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.011531 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2704a13f-1433-4804-8818-e433c50beff1-utilities" (OuterVolumeSpecName: "utilities") pod "2704a13f-1433-4804-8818-e433c50beff1" (UID: 
"2704a13f-1433-4804-8818-e433c50beff1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.028126 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2704a13f-1433-4804-8818-e433c50beff1-kube-api-access-pgpr8" (OuterVolumeSpecName: "kube-api-access-pgpr8") pod "2704a13f-1433-4804-8818-e433c50beff1" (UID: "2704a13f-1433-4804-8818-e433c50beff1"). InnerVolumeSpecName "kube-api-access-pgpr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.086855 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2704a13f-1433-4804-8818-e433c50beff1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2704a13f-1433-4804-8818-e433c50beff1" (UID: "2704a13f-1433-4804-8818-e433c50beff1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.112106 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.112387 4563 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2704a13f-1433-4804-8818-e433c50beff1-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.112408 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgpr8\" (UniqueName: \"kubernetes.io/projected/2704a13f-1433-4804-8818-e433c50beff1-kube-api-access-pgpr8\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.112419 4563 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2704a13f-1433-4804-8818-e433c50beff1-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.213938 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-config-data-custom\") pod \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\" (UID: \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\") " Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.213980 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-scripts\") pod \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\" (UID: \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\") " Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.214008 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-logs\") pod \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\" (UID: \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\") " Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.214049 4563 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-config-data\") pod \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\" (UID: \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\") " Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.214068 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-etc-machine-id\") pod \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\" (UID: \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\") " Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.214192 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-combined-ca-bundle\") pod \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\" (UID: \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\") " Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.214246 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qm4hd\" (UniqueName: \"kubernetes.io/projected/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-kube-api-access-qm4hd\") pod \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\" (UID: \"6a0215ab-b564-47d7-a2fc-f32bd768fa8e\") " Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.214256 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6a0215ab-b564-47d7-a2fc-f32bd768fa8e" (UID: "6a0215ab-b564-47d7-a2fc-f32bd768fa8e"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.214528 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-logs" (OuterVolumeSpecName: "logs") pod "6a0215ab-b564-47d7-a2fc-f32bd768fa8e" (UID: "6a0215ab-b564-47d7-a2fc-f32bd768fa8e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.215207 4563 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-logs\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.215223 4563 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.217903 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-scripts" (OuterVolumeSpecName: "scripts") pod "6a0215ab-b564-47d7-a2fc-f32bd768fa8e" (UID: "6a0215ab-b564-47d7-a2fc-f32bd768fa8e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.219739 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-kube-api-access-qm4hd" (OuterVolumeSpecName: "kube-api-access-qm4hd") pod "6a0215ab-b564-47d7-a2fc-f32bd768fa8e" (UID: "6a0215ab-b564-47d7-a2fc-f32bd768fa8e"). InnerVolumeSpecName "kube-api-access-qm4hd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.222742 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6a0215ab-b564-47d7-a2fc-f32bd768fa8e" (UID: "6a0215ab-b564-47d7-a2fc-f32bd768fa8e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.238951 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a0215ab-b564-47d7-a2fc-f32bd768fa8e" (UID: "6a0215ab-b564-47d7-a2fc-f32bd768fa8e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.256564 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-config-data" (OuterVolumeSpecName: "config-data") pod "6a0215ab-b564-47d7-a2fc-f32bd768fa8e" (UID: "6a0215ab-b564-47d7-a2fc-f32bd768fa8e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.318694 4563 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.318757 4563 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.318783 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.318793 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.318805 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm4hd\" (UniqueName: \"kubernetes.io/projected/6a0215ab-b564-47d7-a2fc-f32bd768fa8e-kube-api-access-qm4hd\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.550888 4563 generic.go:334] "Generic (PLEG): container finished" podID="6a0215ab-b564-47d7-a2fc-f32bd768fa8e" containerID="a0dbfdd103d6294db5d7f20efefc16935541f1f86a83b94d0bb2a31d2fd00d63" exitCode=0 Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.551295 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6a0215ab-b564-47d7-a2fc-f32bd768fa8e","Type":"ContainerDied","Data":"a0dbfdd103d6294db5d7f20efefc16935541f1f86a83b94d0bb2a31d2fd00d63"} Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 
09:19:17.551339 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6a0215ab-b564-47d7-a2fc-f32bd768fa8e","Type":"ContainerDied","Data":"288d0ff70837f8b9427b993cd6183d6369dfc0c85994139227891113a194998c"} Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.551073 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.551308 4563 generic.go:334] "Generic (PLEG): container finished" podID="6a0215ab-b564-47d7-a2fc-f32bd768fa8e" containerID="288d0ff70837f8b9427b993cd6183d6369dfc0c85994139227891113a194998c" exitCode=143 Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.551804 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6a0215ab-b564-47d7-a2fc-f32bd768fa8e","Type":"ContainerDied","Data":"92c191215d90c713706d38f5c1f77f64db767073608344f830e1971070165962"} Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.551848 4563 scope.go:117] "RemoveContainer" containerID="a0dbfdd103d6294db5d7f20efefc16935541f1f86a83b94d0bb2a31d2fd00d63" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.558480 4563 generic.go:334] "Generic (PLEG): container finished" podID="2704a13f-1433-4804-8818-e433c50beff1" containerID="92d72b407fd77954e41bd4d805f09771f3a11466e8cc0d004866b72dfb651b10" exitCode=0 Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.558537 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6r55" event={"ID":"2704a13f-1433-4804-8818-e433c50beff1","Type":"ContainerDied","Data":"92d72b407fd77954e41bd4d805f09771f3a11466e8cc0d004866b72dfb651b10"} Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.558565 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6r55" 
event={"ID":"2704a13f-1433-4804-8818-e433c50beff1","Type":"ContainerDied","Data":"8ddc29984564c821f8ff87fc4f5f775f8ca363365a56521da52a3dc0faf9db87"} Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.558577 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r6r55" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.563243 4563 generic.go:334] "Generic (PLEG): container finished" podID="b995b5b3-41b0-4334-9f7c-792a50e780e7" containerID="714c30487c4e7dfc7a7cf3f6168922dc32da768f7e7dbe76847800c55047ae48" exitCode=0 Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.563357 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b995b5b3-41b0-4334-9f7c-792a50e780e7","Type":"ContainerDied","Data":"714c30487c4e7dfc7a7cf3f6168922dc32da768f7e7dbe76847800c55047ae48"} Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.587956 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.594176 4563 scope.go:117] "RemoveContainer" containerID="288d0ff70837f8b9427b993cd6183d6369dfc0c85994139227891113a194998c" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.639047 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.672362 4563 scope.go:117] "RemoveContainer" containerID="a0dbfdd103d6294db5d7f20efefc16935541f1f86a83b94d0bb2a31d2fd00d63" Nov 24 09:19:17 crc kubenswrapper[4563]: E1124 09:19:17.679514 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0dbfdd103d6294db5d7f20efefc16935541f1f86a83b94d0bb2a31d2fd00d63\": container with ID starting with a0dbfdd103d6294db5d7f20efefc16935541f1f86a83b94d0bb2a31d2fd00d63 not found: ID does not exist" 
containerID="a0dbfdd103d6294db5d7f20efefc16935541f1f86a83b94d0bb2a31d2fd00d63" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.679581 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0dbfdd103d6294db5d7f20efefc16935541f1f86a83b94d0bb2a31d2fd00d63"} err="failed to get container status \"a0dbfdd103d6294db5d7f20efefc16935541f1f86a83b94d0bb2a31d2fd00d63\": rpc error: code = NotFound desc = could not find container \"a0dbfdd103d6294db5d7f20efefc16935541f1f86a83b94d0bb2a31d2fd00d63\": container with ID starting with a0dbfdd103d6294db5d7f20efefc16935541f1f86a83b94d0bb2a31d2fd00d63 not found: ID does not exist" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.679609 4563 scope.go:117] "RemoveContainer" containerID="288d0ff70837f8b9427b993cd6183d6369dfc0c85994139227891113a194998c" Nov 24 09:19:17 crc kubenswrapper[4563]: E1124 09:19:17.683786 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"288d0ff70837f8b9427b993cd6183d6369dfc0c85994139227891113a194998c\": container with ID starting with 288d0ff70837f8b9427b993cd6183d6369dfc0c85994139227891113a194998c not found: ID does not exist" containerID="288d0ff70837f8b9427b993cd6183d6369dfc0c85994139227891113a194998c" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.683889 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"288d0ff70837f8b9427b993cd6183d6369dfc0c85994139227891113a194998c"} err="failed to get container status \"288d0ff70837f8b9427b993cd6183d6369dfc0c85994139227891113a194998c\": rpc error: code = NotFound desc = could not find container \"288d0ff70837f8b9427b993cd6183d6369dfc0c85994139227891113a194998c\": container with ID starting with 288d0ff70837f8b9427b993cd6183d6369dfc0c85994139227891113a194998c not found: ID does not exist" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.683949 4563 scope.go:117] 
"RemoveContainer" containerID="a0dbfdd103d6294db5d7f20efefc16935541f1f86a83b94d0bb2a31d2fd00d63" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.684150 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.684628 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0dbfdd103d6294db5d7f20efefc16935541f1f86a83b94d0bb2a31d2fd00d63"} err="failed to get container status \"a0dbfdd103d6294db5d7f20efefc16935541f1f86a83b94d0bb2a31d2fd00d63\": rpc error: code = NotFound desc = could not find container \"a0dbfdd103d6294db5d7f20efefc16935541f1f86a83b94d0bb2a31d2fd00d63\": container with ID starting with a0dbfdd103d6294db5d7f20efefc16935541f1f86a83b94d0bb2a31d2fd00d63 not found: ID does not exist" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.684673 4563 scope.go:117] "RemoveContainer" containerID="288d0ff70837f8b9427b993cd6183d6369dfc0c85994139227891113a194998c" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.689042 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"288d0ff70837f8b9427b993cd6183d6369dfc0c85994139227891113a194998c"} err="failed to get container status \"288d0ff70837f8b9427b993cd6183d6369dfc0c85994139227891113a194998c\": rpc error: code = NotFound desc = could not find container \"288d0ff70837f8b9427b993cd6183d6369dfc0c85994139227891113a194998c\": container with ID starting with 288d0ff70837f8b9427b993cd6183d6369dfc0c85994139227891113a194998c not found: ID does not exist" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.689081 4563 scope.go:117] "RemoveContainer" containerID="92d72b407fd77954e41bd4d805f09771f3a11466e8cc0d004866b72dfb651b10" Nov 24 09:19:17 crc kubenswrapper[4563]: E1124 09:19:17.692165 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2704a13f-1433-4804-8818-e433c50beff1" containerName="extract-utilities" Nov 24 
09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.692206 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="2704a13f-1433-4804-8818-e433c50beff1" containerName="extract-utilities" Nov 24 09:19:17 crc kubenswrapper[4563]: E1124 09:19:17.692219 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a0215ab-b564-47d7-a2fc-f32bd768fa8e" containerName="cinder-api" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.692226 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a0215ab-b564-47d7-a2fc-f32bd768fa8e" containerName="cinder-api" Nov 24 09:19:17 crc kubenswrapper[4563]: E1124 09:19:17.692287 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a0215ab-b564-47d7-a2fc-f32bd768fa8e" containerName="cinder-api-log" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.692295 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a0215ab-b564-47d7-a2fc-f32bd768fa8e" containerName="cinder-api-log" Nov 24 09:19:17 crc kubenswrapper[4563]: E1124 09:19:17.692323 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2704a13f-1433-4804-8818-e433c50beff1" containerName="registry-server" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.692330 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="2704a13f-1433-4804-8818-e433c50beff1" containerName="registry-server" Nov 24 09:19:17 crc kubenswrapper[4563]: E1124 09:19:17.692376 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2704a13f-1433-4804-8818-e433c50beff1" containerName="extract-content" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.692384 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="2704a13f-1433-4804-8818-e433c50beff1" containerName="extract-content" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.702771 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="2704a13f-1433-4804-8818-e433c50beff1" containerName="registry-server" Nov 24 09:19:17 crc 
kubenswrapper[4563]: I1124 09:19:17.702820 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a0215ab-b564-47d7-a2fc-f32bd768fa8e" containerName="cinder-api-log" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.702878 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a0215ab-b564-47d7-a2fc-f32bd768fa8e" containerName="cinder-api" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.705001 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.708989 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r6r55"] Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.715334 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.715587 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.723900 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.724148 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.740482 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r6r55"] Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.765273 4563 scope.go:117] "RemoveContainer" containerID="5de5020a45d59de2be95c0099ba39e7e4646bfc7d66cbcf5a35f369521aa9761" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.778849 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.835085 4563 scope.go:117] "RemoveContainer" containerID="220ae75c298fb1ac368ca98f4b5b5c18864c721087fb935d1b679336694d8be5" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.842522 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b995b5b3-41b0-4334-9f7c-792a50e780e7-scripts\") pod \"b995b5b3-41b0-4334-9f7c-792a50e780e7\" (UID: \"b995b5b3-41b0-4334-9f7c-792a50e780e7\") " Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.842602 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plvfd\" (UniqueName: \"kubernetes.io/projected/b995b5b3-41b0-4334-9f7c-792a50e780e7-kube-api-access-plvfd\") pod \"b995b5b3-41b0-4334-9f7c-792a50e780e7\" (UID: \"b995b5b3-41b0-4334-9f7c-792a50e780e7\") " Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.842717 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b995b5b3-41b0-4334-9f7c-792a50e780e7-combined-ca-bundle\") pod \"b995b5b3-41b0-4334-9f7c-792a50e780e7\" (UID: \"b995b5b3-41b0-4334-9f7c-792a50e780e7\") " Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.844187 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b995b5b3-41b0-4334-9f7c-792a50e780e7-sg-core-conf-yaml\") pod \"b995b5b3-41b0-4334-9f7c-792a50e780e7\" (UID: \"b995b5b3-41b0-4334-9f7c-792a50e780e7\") " Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.844272 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b995b5b3-41b0-4334-9f7c-792a50e780e7-log-httpd\") pod \"b995b5b3-41b0-4334-9f7c-792a50e780e7\" (UID: \"b995b5b3-41b0-4334-9f7c-792a50e780e7\") " 
Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.844312 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b995b5b3-41b0-4334-9f7c-792a50e780e7-run-httpd\") pod \"b995b5b3-41b0-4334-9f7c-792a50e780e7\" (UID: \"b995b5b3-41b0-4334-9f7c-792a50e780e7\") " Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.844365 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b995b5b3-41b0-4334-9f7c-792a50e780e7-config-data\") pod \"b995b5b3-41b0-4334-9f7c-792a50e780e7\" (UID: \"b995b5b3-41b0-4334-9f7c-792a50e780e7\") " Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.845317 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd5367ce-55f6-4685-b414-4ef54ce7df7a-logs\") pod \"cinder-api-0\" (UID: \"dd5367ce-55f6-4685-b414-4ef54ce7df7a\") " pod="openstack/cinder-api-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.845415 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6txgx\" (UniqueName: \"kubernetes.io/projected/dd5367ce-55f6-4685-b414-4ef54ce7df7a-kube-api-access-6txgx\") pod \"cinder-api-0\" (UID: \"dd5367ce-55f6-4685-b414-4ef54ce7df7a\") " pod="openstack/cinder-api-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.845468 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd5367ce-55f6-4685-b414-4ef54ce7df7a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dd5367ce-55f6-4685-b414-4ef54ce7df7a\") " pod="openstack/cinder-api-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.845601 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/dd5367ce-55f6-4685-b414-4ef54ce7df7a-scripts\") pod \"cinder-api-0\" (UID: \"dd5367ce-55f6-4685-b414-4ef54ce7df7a\") " pod="openstack/cinder-api-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.845627 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd5367ce-55f6-4685-b414-4ef54ce7df7a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dd5367ce-55f6-4685-b414-4ef54ce7df7a\") " pod="openstack/cinder-api-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.846425 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd5367ce-55f6-4685-b414-4ef54ce7df7a-config-data-custom\") pod \"cinder-api-0\" (UID: \"dd5367ce-55f6-4685-b414-4ef54ce7df7a\") " pod="openstack/cinder-api-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.846552 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd5367ce-55f6-4685-b414-4ef54ce7df7a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dd5367ce-55f6-4685-b414-4ef54ce7df7a\") " pod="openstack/cinder-api-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.846577 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd5367ce-55f6-4685-b414-4ef54ce7df7a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dd5367ce-55f6-4685-b414-4ef54ce7df7a\") " pod="openstack/cinder-api-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.846613 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd5367ce-55f6-4685-b414-4ef54ce7df7a-config-data\") pod \"cinder-api-0\" (UID: 
\"dd5367ce-55f6-4685-b414-4ef54ce7df7a\") " pod="openstack/cinder-api-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.847705 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b995b5b3-41b0-4334-9f7c-792a50e780e7-kube-api-access-plvfd" (OuterVolumeSpecName: "kube-api-access-plvfd") pod "b995b5b3-41b0-4334-9f7c-792a50e780e7" (UID: "b995b5b3-41b0-4334-9f7c-792a50e780e7"). InnerVolumeSpecName "kube-api-access-plvfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.847905 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b995b5b3-41b0-4334-9f7c-792a50e780e7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b995b5b3-41b0-4334-9f7c-792a50e780e7" (UID: "b995b5b3-41b0-4334-9f7c-792a50e780e7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.848072 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b995b5b3-41b0-4334-9f7c-792a50e780e7-scripts" (OuterVolumeSpecName: "scripts") pod "b995b5b3-41b0-4334-9f7c-792a50e780e7" (UID: "b995b5b3-41b0-4334-9f7c-792a50e780e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.848225 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b995b5b3-41b0-4334-9f7c-792a50e780e7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b995b5b3-41b0-4334-9f7c-792a50e780e7" (UID: "b995b5b3-41b0-4334-9f7c-792a50e780e7"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.875791 4563 scope.go:117] "RemoveContainer" containerID="92d72b407fd77954e41bd4d805f09771f3a11466e8cc0d004866b72dfb651b10" Nov 24 09:19:17 crc kubenswrapper[4563]: E1124 09:19:17.876293 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92d72b407fd77954e41bd4d805f09771f3a11466e8cc0d004866b72dfb651b10\": container with ID starting with 92d72b407fd77954e41bd4d805f09771f3a11466e8cc0d004866b72dfb651b10 not found: ID does not exist" containerID="92d72b407fd77954e41bd4d805f09771f3a11466e8cc0d004866b72dfb651b10" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.876350 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92d72b407fd77954e41bd4d805f09771f3a11466e8cc0d004866b72dfb651b10"} err="failed to get container status \"92d72b407fd77954e41bd4d805f09771f3a11466e8cc0d004866b72dfb651b10\": rpc error: code = NotFound desc = could not find container \"92d72b407fd77954e41bd4d805f09771f3a11466e8cc0d004866b72dfb651b10\": container with ID starting with 92d72b407fd77954e41bd4d805f09771f3a11466e8cc0d004866b72dfb651b10 not found: ID does not exist" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.876384 4563 scope.go:117] "RemoveContainer" containerID="5de5020a45d59de2be95c0099ba39e7e4646bfc7d66cbcf5a35f369521aa9761" Nov 24 09:19:17 crc kubenswrapper[4563]: E1124 09:19:17.877412 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5de5020a45d59de2be95c0099ba39e7e4646bfc7d66cbcf5a35f369521aa9761\": container with ID starting with 5de5020a45d59de2be95c0099ba39e7e4646bfc7d66cbcf5a35f369521aa9761 not found: ID does not exist" containerID="5de5020a45d59de2be95c0099ba39e7e4646bfc7d66cbcf5a35f369521aa9761" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.877453 
4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5de5020a45d59de2be95c0099ba39e7e4646bfc7d66cbcf5a35f369521aa9761"} err="failed to get container status \"5de5020a45d59de2be95c0099ba39e7e4646bfc7d66cbcf5a35f369521aa9761\": rpc error: code = NotFound desc = could not find container \"5de5020a45d59de2be95c0099ba39e7e4646bfc7d66cbcf5a35f369521aa9761\": container with ID starting with 5de5020a45d59de2be95c0099ba39e7e4646bfc7d66cbcf5a35f369521aa9761 not found: ID does not exist" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.877470 4563 scope.go:117] "RemoveContainer" containerID="220ae75c298fb1ac368ca98f4b5b5c18864c721087fb935d1b679336694d8be5" Nov 24 09:19:17 crc kubenswrapper[4563]: E1124 09:19:17.877704 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"220ae75c298fb1ac368ca98f4b5b5c18864c721087fb935d1b679336694d8be5\": container with ID starting with 220ae75c298fb1ac368ca98f4b5b5c18864c721087fb935d1b679336694d8be5 not found: ID does not exist" containerID="220ae75c298fb1ac368ca98f4b5b5c18864c721087fb935d1b679336694d8be5" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.877739 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"220ae75c298fb1ac368ca98f4b5b5c18864c721087fb935d1b679336694d8be5"} err="failed to get container status \"220ae75c298fb1ac368ca98f4b5b5c18864c721087fb935d1b679336694d8be5\": rpc error: code = NotFound desc = could not find container \"220ae75c298fb1ac368ca98f4b5b5c18864c721087fb935d1b679336694d8be5\": container with ID starting with 220ae75c298fb1ac368ca98f4b5b5c18864c721087fb935d1b679336694d8be5 not found: ID does not exist" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.880742 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b995b5b3-41b0-4334-9f7c-792a50e780e7-sg-core-conf-yaml" 
(OuterVolumeSpecName: "sg-core-conf-yaml") pod "b995b5b3-41b0-4334-9f7c-792a50e780e7" (UID: "b995b5b3-41b0-4334-9f7c-792a50e780e7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.906183 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b995b5b3-41b0-4334-9f7c-792a50e780e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b995b5b3-41b0-4334-9f7c-792a50e780e7" (UID: "b995b5b3-41b0-4334-9f7c-792a50e780e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.924827 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b995b5b3-41b0-4334-9f7c-792a50e780e7-config-data" (OuterVolumeSpecName: "config-data") pod "b995b5b3-41b0-4334-9f7c-792a50e780e7" (UID: "b995b5b3-41b0-4334-9f7c-792a50e780e7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.948264 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6txgx\" (UniqueName: \"kubernetes.io/projected/dd5367ce-55f6-4685-b414-4ef54ce7df7a-kube-api-access-6txgx\") pod \"cinder-api-0\" (UID: \"dd5367ce-55f6-4685-b414-4ef54ce7df7a\") " pod="openstack/cinder-api-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.948322 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd5367ce-55f6-4685-b414-4ef54ce7df7a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dd5367ce-55f6-4685-b414-4ef54ce7df7a\") " pod="openstack/cinder-api-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.948384 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd5367ce-55f6-4685-b414-4ef54ce7df7a-scripts\") pod \"cinder-api-0\" (UID: \"dd5367ce-55f6-4685-b414-4ef54ce7df7a\") " pod="openstack/cinder-api-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.948403 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd5367ce-55f6-4685-b414-4ef54ce7df7a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dd5367ce-55f6-4685-b414-4ef54ce7df7a\") " pod="openstack/cinder-api-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.948423 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd5367ce-55f6-4685-b414-4ef54ce7df7a-config-data-custom\") pod \"cinder-api-0\" (UID: \"dd5367ce-55f6-4685-b414-4ef54ce7df7a\") " pod="openstack/cinder-api-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.948465 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd5367ce-55f6-4685-b414-4ef54ce7df7a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dd5367ce-55f6-4685-b414-4ef54ce7df7a\") " pod="openstack/cinder-api-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.948481 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd5367ce-55f6-4685-b414-4ef54ce7df7a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dd5367ce-55f6-4685-b414-4ef54ce7df7a\") " pod="openstack/cinder-api-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.948501 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd5367ce-55f6-4685-b414-4ef54ce7df7a-config-data\") pod \"cinder-api-0\" (UID: \"dd5367ce-55f6-4685-b414-4ef54ce7df7a\") " pod="openstack/cinder-api-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.948549 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd5367ce-55f6-4685-b414-4ef54ce7df7a-logs\") pod \"cinder-api-0\" (UID: \"dd5367ce-55f6-4685-b414-4ef54ce7df7a\") " pod="openstack/cinder-api-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.948572 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd5367ce-55f6-4685-b414-4ef54ce7df7a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dd5367ce-55f6-4685-b414-4ef54ce7df7a\") " pod="openstack/cinder-api-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.948602 4563 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b995b5b3-41b0-4334-9f7c-792a50e780e7-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.948613 4563 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-plvfd\" (UniqueName: \"kubernetes.io/projected/b995b5b3-41b0-4334-9f7c-792a50e780e7-kube-api-access-plvfd\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.948625 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b995b5b3-41b0-4334-9f7c-792a50e780e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.948633 4563 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b995b5b3-41b0-4334-9f7c-792a50e780e7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.948655 4563 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b995b5b3-41b0-4334-9f7c-792a50e780e7-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.948662 4563 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b995b5b3-41b0-4334-9f7c-792a50e780e7-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.948670 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b995b5b3-41b0-4334-9f7c-792a50e780e7-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.949036 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd5367ce-55f6-4685-b414-4ef54ce7df7a-logs\") pod \"cinder-api-0\" (UID: \"dd5367ce-55f6-4685-b414-4ef54ce7df7a\") " pod="openstack/cinder-api-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.952811 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/dd5367ce-55f6-4685-b414-4ef54ce7df7a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dd5367ce-55f6-4685-b414-4ef54ce7df7a\") " pod="openstack/cinder-api-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.952993 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd5367ce-55f6-4685-b414-4ef54ce7df7a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dd5367ce-55f6-4685-b414-4ef54ce7df7a\") " pod="openstack/cinder-api-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.953117 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd5367ce-55f6-4685-b414-4ef54ce7df7a-scripts\") pod \"cinder-api-0\" (UID: \"dd5367ce-55f6-4685-b414-4ef54ce7df7a\") " pod="openstack/cinder-api-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.953601 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd5367ce-55f6-4685-b414-4ef54ce7df7a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dd5367ce-55f6-4685-b414-4ef54ce7df7a\") " pod="openstack/cinder-api-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.954893 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd5367ce-55f6-4685-b414-4ef54ce7df7a-config-data-custom\") pod \"cinder-api-0\" (UID: \"dd5367ce-55f6-4685-b414-4ef54ce7df7a\") " pod="openstack/cinder-api-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.956433 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd5367ce-55f6-4685-b414-4ef54ce7df7a-config-data\") pod \"cinder-api-0\" (UID: \"dd5367ce-55f6-4685-b414-4ef54ce7df7a\") " pod="openstack/cinder-api-0" Nov 24 09:19:17 crc kubenswrapper[4563]: I1124 09:19:17.966095 4563 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6txgx\" (UniqueName: \"kubernetes.io/projected/dd5367ce-55f6-4685-b414-4ef54ce7df7a-kube-api-access-6txgx\") pod \"cinder-api-0\" (UID: \"dd5367ce-55f6-4685-b414-4ef54ce7df7a\") " pod="openstack/cinder-api-0" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.123533 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.133760 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-b876f5fb4-sx5lp"] Nov 24 09:19:18 crc kubenswrapper[4563]: E1124 09:19:18.134116 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b995b5b3-41b0-4334-9f7c-792a50e780e7" containerName="ceilometer-notification-agent" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.134135 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="b995b5b3-41b0-4334-9f7c-792a50e780e7" containerName="ceilometer-notification-agent" Nov 24 09:19:18 crc kubenswrapper[4563]: E1124 09:19:18.134147 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b995b5b3-41b0-4334-9f7c-792a50e780e7" containerName="sg-core" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.134155 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="b995b5b3-41b0-4334-9f7c-792a50e780e7" containerName="sg-core" Nov 24 09:19:18 crc kubenswrapper[4563]: E1124 09:19:18.134165 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b995b5b3-41b0-4334-9f7c-792a50e780e7" containerName="proxy-httpd" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.134171 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="b995b5b3-41b0-4334-9f7c-792a50e780e7" containerName="proxy-httpd" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.134334 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="b995b5b3-41b0-4334-9f7c-792a50e780e7" 
containerName="sg-core" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.134361 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="b995b5b3-41b0-4334-9f7c-792a50e780e7" containerName="proxy-httpd" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.134384 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="b995b5b3-41b0-4334-9f7c-792a50e780e7" containerName="ceilometer-notification-agent" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.135255 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-b876f5fb4-sx5lp" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.140981 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.141163 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.160146 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-b876f5fb4-sx5lp"] Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.264409 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29588100-1198-4e82-a1c3-87d27b71aa65-logs\") pod \"barbican-api-b876f5fb4-sx5lp\" (UID: \"29588100-1198-4e82-a1c3-87d27b71aa65\") " pod="openstack/barbican-api-b876f5fb4-sx5lp" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.264558 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29588100-1198-4e82-a1c3-87d27b71aa65-config-data\") pod \"barbican-api-b876f5fb4-sx5lp\" (UID: \"29588100-1198-4e82-a1c3-87d27b71aa65\") " pod="openstack/barbican-api-b876f5fb4-sx5lp" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.264608 4563 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29588100-1198-4e82-a1c3-87d27b71aa65-public-tls-certs\") pod \"barbican-api-b876f5fb4-sx5lp\" (UID: \"29588100-1198-4e82-a1c3-87d27b71aa65\") " pod="openstack/barbican-api-b876f5fb4-sx5lp" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.264952 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29588100-1198-4e82-a1c3-87d27b71aa65-config-data-custom\") pod \"barbican-api-b876f5fb4-sx5lp\" (UID: \"29588100-1198-4e82-a1c3-87d27b71aa65\") " pod="openstack/barbican-api-b876f5fb4-sx5lp" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.265020 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29588100-1198-4e82-a1c3-87d27b71aa65-combined-ca-bundle\") pod \"barbican-api-b876f5fb4-sx5lp\" (UID: \"29588100-1198-4e82-a1c3-87d27b71aa65\") " pod="openstack/barbican-api-b876f5fb4-sx5lp" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.265064 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29588100-1198-4e82-a1c3-87d27b71aa65-internal-tls-certs\") pod \"barbican-api-b876f5fb4-sx5lp\" (UID: \"29588100-1198-4e82-a1c3-87d27b71aa65\") " pod="openstack/barbican-api-b876f5fb4-sx5lp" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.265175 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msb4c\" (UniqueName: \"kubernetes.io/projected/29588100-1198-4e82-a1c3-87d27b71aa65-kube-api-access-msb4c\") pod \"barbican-api-b876f5fb4-sx5lp\" (UID: \"29588100-1198-4e82-a1c3-87d27b71aa65\") " 
pod="openstack/barbican-api-b876f5fb4-sx5lp" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.367288 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29588100-1198-4e82-a1c3-87d27b71aa65-config-data\") pod \"barbican-api-b876f5fb4-sx5lp\" (UID: \"29588100-1198-4e82-a1c3-87d27b71aa65\") " pod="openstack/barbican-api-b876f5fb4-sx5lp" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.367540 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29588100-1198-4e82-a1c3-87d27b71aa65-public-tls-certs\") pod \"barbican-api-b876f5fb4-sx5lp\" (UID: \"29588100-1198-4e82-a1c3-87d27b71aa65\") " pod="openstack/barbican-api-b876f5fb4-sx5lp" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.367600 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29588100-1198-4e82-a1c3-87d27b71aa65-config-data-custom\") pod \"barbican-api-b876f5fb4-sx5lp\" (UID: \"29588100-1198-4e82-a1c3-87d27b71aa65\") " pod="openstack/barbican-api-b876f5fb4-sx5lp" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.367631 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29588100-1198-4e82-a1c3-87d27b71aa65-combined-ca-bundle\") pod \"barbican-api-b876f5fb4-sx5lp\" (UID: \"29588100-1198-4e82-a1c3-87d27b71aa65\") " pod="openstack/barbican-api-b876f5fb4-sx5lp" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.367671 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29588100-1198-4e82-a1c3-87d27b71aa65-internal-tls-certs\") pod \"barbican-api-b876f5fb4-sx5lp\" (UID: \"29588100-1198-4e82-a1c3-87d27b71aa65\") " pod="openstack/barbican-api-b876f5fb4-sx5lp" 
Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.367727 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msb4c\" (UniqueName: \"kubernetes.io/projected/29588100-1198-4e82-a1c3-87d27b71aa65-kube-api-access-msb4c\") pod \"barbican-api-b876f5fb4-sx5lp\" (UID: \"29588100-1198-4e82-a1c3-87d27b71aa65\") " pod="openstack/barbican-api-b876f5fb4-sx5lp" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.367768 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29588100-1198-4e82-a1c3-87d27b71aa65-logs\") pod \"barbican-api-b876f5fb4-sx5lp\" (UID: \"29588100-1198-4e82-a1c3-87d27b71aa65\") " pod="openstack/barbican-api-b876f5fb4-sx5lp" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.368186 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29588100-1198-4e82-a1c3-87d27b71aa65-logs\") pod \"barbican-api-b876f5fb4-sx5lp\" (UID: \"29588100-1198-4e82-a1c3-87d27b71aa65\") " pod="openstack/barbican-api-b876f5fb4-sx5lp" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.372960 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29588100-1198-4e82-a1c3-87d27b71aa65-config-data-custom\") pod \"barbican-api-b876f5fb4-sx5lp\" (UID: \"29588100-1198-4e82-a1c3-87d27b71aa65\") " pod="openstack/barbican-api-b876f5fb4-sx5lp" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.373112 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29588100-1198-4e82-a1c3-87d27b71aa65-internal-tls-certs\") pod \"barbican-api-b876f5fb4-sx5lp\" (UID: \"29588100-1198-4e82-a1c3-87d27b71aa65\") " pod="openstack/barbican-api-b876f5fb4-sx5lp" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.373971 4563 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29588100-1198-4e82-a1c3-87d27b71aa65-config-data\") pod \"barbican-api-b876f5fb4-sx5lp\" (UID: \"29588100-1198-4e82-a1c3-87d27b71aa65\") " pod="openstack/barbican-api-b876f5fb4-sx5lp" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.374448 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29588100-1198-4e82-a1c3-87d27b71aa65-combined-ca-bundle\") pod \"barbican-api-b876f5fb4-sx5lp\" (UID: \"29588100-1198-4e82-a1c3-87d27b71aa65\") " pod="openstack/barbican-api-b876f5fb4-sx5lp" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.388553 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msb4c\" (UniqueName: \"kubernetes.io/projected/29588100-1198-4e82-a1c3-87d27b71aa65-kube-api-access-msb4c\") pod \"barbican-api-b876f5fb4-sx5lp\" (UID: \"29588100-1198-4e82-a1c3-87d27b71aa65\") " pod="openstack/barbican-api-b876f5fb4-sx5lp" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.397971 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29588100-1198-4e82-a1c3-87d27b71aa65-public-tls-certs\") pod \"barbican-api-b876f5fb4-sx5lp\" (UID: \"29588100-1198-4e82-a1c3-87d27b71aa65\") " pod="openstack/barbican-api-b876f5fb4-sx5lp" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.457143 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-b876f5fb4-sx5lp" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.591712 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.592982 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b995b5b3-41b0-4334-9f7c-792a50e780e7","Type":"ContainerDied","Data":"395027ed71257e44b947a95a94e94bfd854681ce9148cbec23cb43999743c9a0"} Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.593049 4563 scope.go:117] "RemoveContainer" containerID="3415a060233e9b99461d2d14ff354bed25e3e9025c2ceeeda7ae7692d24374e4" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.593197 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 09:19:18 crc kubenswrapper[4563]: W1124 09:19:18.603823 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd5367ce_55f6_4685_b414_4ef54ce7df7a.slice/crio-c77e8a4f17369b44e17c56f791531af23b334c9859c1bf52b1fe6f52990043cf WatchSource:0}: Error finding container c77e8a4f17369b44e17c56f791531af23b334c9859c1bf52b1fe6f52990043cf: Status 404 returned error can't find the container with id c77e8a4f17369b44e17c56f791531af23b334c9859c1bf52b1fe6f52990043cf Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.674796 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.680695 4563 scope.go:117] "RemoveContainer" containerID="3991af0b0f902f2e913b758214f68941b05dd8ebc1d3590242e7b910416be4c7" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.684400 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.698953 4563 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.720000 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.722500 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.737084 4563 scope.go:117] "RemoveContainer" containerID="714c30487c4e7dfc7a7cf3f6168922dc32da768f7e7dbe76847800c55047ae48" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.737216 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.737452 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.738264 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.860516 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6d999bbd6-cqj6s" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.882787 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3effc21a-ac22-4712-ae88-2b318473ccee-run-httpd\") pod \"ceilometer-0\" (UID: \"3effc21a-ac22-4712-ae88-2b318473ccee\") " pod="openstack/ceilometer-0" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.882879 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wg2r\" (UniqueName: \"kubernetes.io/projected/3effc21a-ac22-4712-ae88-2b318473ccee-kube-api-access-7wg2r\") pod \"ceilometer-0\" (UID: \"3effc21a-ac22-4712-ae88-2b318473ccee\") " pod="openstack/ceilometer-0" Nov 24 09:19:18 crc 
kubenswrapper[4563]: I1124 09:19:18.882924 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3effc21a-ac22-4712-ae88-2b318473ccee-config-data\") pod \"ceilometer-0\" (UID: \"3effc21a-ac22-4712-ae88-2b318473ccee\") " pod="openstack/ceilometer-0" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.883046 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3effc21a-ac22-4712-ae88-2b318473ccee-scripts\") pod \"ceilometer-0\" (UID: \"3effc21a-ac22-4712-ae88-2b318473ccee\") " pod="openstack/ceilometer-0" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.883075 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3effc21a-ac22-4712-ae88-2b318473ccee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3effc21a-ac22-4712-ae88-2b318473ccee\") " pod="openstack/ceilometer-0" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.883095 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3effc21a-ac22-4712-ae88-2b318473ccee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3effc21a-ac22-4712-ae88-2b318473ccee\") " pod="openstack/ceilometer-0" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.883134 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3effc21a-ac22-4712-ae88-2b318473ccee-log-httpd\") pod \"ceilometer-0\" (UID: \"3effc21a-ac22-4712-ae88-2b318473ccee\") " pod="openstack/ceilometer-0" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.925862 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7cd5c59c66-hrmf5"] Nov 24 
09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.943554 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-b876f5fb4-sx5lp"] Nov 24 09:19:18 crc kubenswrapper[4563]: W1124 09:19:18.965820 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29588100_1198_4e82_a1c3_87d27b71aa65.slice/crio-5751af57848c9b0d6541859f746d3f0e0edae16cf38506eb6f35f457bcf96c36 WatchSource:0}: Error finding container 5751af57848c9b0d6541859f746d3f0e0edae16cf38506eb6f35f457bcf96c36: Status 404 returned error can't find the container with id 5751af57848c9b0d6541859f746d3f0e0edae16cf38506eb6f35f457bcf96c36 Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.984230 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3effc21a-ac22-4712-ae88-2b318473ccee-config-data\") pod \"ceilometer-0\" (UID: \"3effc21a-ac22-4712-ae88-2b318473ccee\") " pod="openstack/ceilometer-0" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.984342 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3effc21a-ac22-4712-ae88-2b318473ccee-scripts\") pod \"ceilometer-0\" (UID: \"3effc21a-ac22-4712-ae88-2b318473ccee\") " pod="openstack/ceilometer-0" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.984367 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3effc21a-ac22-4712-ae88-2b318473ccee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3effc21a-ac22-4712-ae88-2b318473ccee\") " pod="openstack/ceilometer-0" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.984397 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/3effc21a-ac22-4712-ae88-2b318473ccee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3effc21a-ac22-4712-ae88-2b318473ccee\") " pod="openstack/ceilometer-0" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.984421 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3effc21a-ac22-4712-ae88-2b318473ccee-log-httpd\") pod \"ceilometer-0\" (UID: \"3effc21a-ac22-4712-ae88-2b318473ccee\") " pod="openstack/ceilometer-0" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.984472 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3effc21a-ac22-4712-ae88-2b318473ccee-run-httpd\") pod \"ceilometer-0\" (UID: \"3effc21a-ac22-4712-ae88-2b318473ccee\") " pod="openstack/ceilometer-0" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.984497 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wg2r\" (UniqueName: \"kubernetes.io/projected/3effc21a-ac22-4712-ae88-2b318473ccee-kube-api-access-7wg2r\") pod \"ceilometer-0\" (UID: \"3effc21a-ac22-4712-ae88-2b318473ccee\") " pod="openstack/ceilometer-0" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.987772 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3effc21a-ac22-4712-ae88-2b318473ccee-log-httpd\") pod \"ceilometer-0\" (UID: \"3effc21a-ac22-4712-ae88-2b318473ccee\") " pod="openstack/ceilometer-0" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.989123 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3effc21a-ac22-4712-ae88-2b318473ccee-run-httpd\") pod \"ceilometer-0\" (UID: \"3effc21a-ac22-4712-ae88-2b318473ccee\") " pod="openstack/ceilometer-0" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.993055 4563 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3effc21a-ac22-4712-ae88-2b318473ccee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3effc21a-ac22-4712-ae88-2b318473ccee\") " pod="openstack/ceilometer-0" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.996161 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3effc21a-ac22-4712-ae88-2b318473ccee-scripts\") pod \"ceilometer-0\" (UID: \"3effc21a-ac22-4712-ae88-2b318473ccee\") " pod="openstack/ceilometer-0" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.996324 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3effc21a-ac22-4712-ae88-2b318473ccee-config-data\") pod \"ceilometer-0\" (UID: \"3effc21a-ac22-4712-ae88-2b318473ccee\") " pod="openstack/ceilometer-0" Nov 24 09:19:18 crc kubenswrapper[4563]: I1124 09:19:18.996359 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3effc21a-ac22-4712-ae88-2b318473ccee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3effc21a-ac22-4712-ae88-2b318473ccee\") " pod="openstack/ceilometer-0" Nov 24 09:19:19 crc kubenswrapper[4563]: I1124 09:19:19.008148 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wg2r\" (UniqueName: \"kubernetes.io/projected/3effc21a-ac22-4712-ae88-2b318473ccee-kube-api-access-7wg2r\") pod \"ceilometer-0\" (UID: \"3effc21a-ac22-4712-ae88-2b318473ccee\") " pod="openstack/ceilometer-0" Nov 24 09:19:19 crc kubenswrapper[4563]: I1124 09:19:19.064609 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 09:19:19 crc kubenswrapper[4563]: I1124 09:19:19.070083 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2704a13f-1433-4804-8818-e433c50beff1" path="/var/lib/kubelet/pods/2704a13f-1433-4804-8818-e433c50beff1/volumes" Nov 24 09:19:19 crc kubenswrapper[4563]: I1124 09:19:19.070993 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a0215ab-b564-47d7-a2fc-f32bd768fa8e" path="/var/lib/kubelet/pods/6a0215ab-b564-47d7-a2fc-f32bd768fa8e/volumes" Nov 24 09:19:19 crc kubenswrapper[4563]: I1124 09:19:19.077125 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b995b5b3-41b0-4334-9f7c-792a50e780e7" path="/var/lib/kubelet/pods/b995b5b3-41b0-4334-9f7c-792a50e780e7/volumes" Nov 24 09:19:19 crc kubenswrapper[4563]: I1124 09:19:19.379003 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-787479d464-54bqr" Nov 24 09:19:19 crc kubenswrapper[4563]: I1124 09:19:19.419552 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6ffbb5d-hrkmt" Nov 24 09:19:19 crc kubenswrapper[4563]: I1124 09:19:19.596102 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:19:19 crc kubenswrapper[4563]: I1124 09:19:19.647029 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b876f5fb4-sx5lp" event={"ID":"29588100-1198-4e82-a1c3-87d27b71aa65","Type":"ContainerStarted","Data":"1384fc22c8bfc625e5ce81aee626bf75780cec251cce2f4496e8fef4f482cd07"} Nov 24 09:19:19 crc kubenswrapper[4563]: I1124 09:19:19.647094 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b876f5fb4-sx5lp" event={"ID":"29588100-1198-4e82-a1c3-87d27b71aa65","Type":"ContainerStarted","Data":"6a01882dd510ccf98e79f5e5ed9bbaf82557ac5409c561c807c59a4bf5604d34"} Nov 24 09:19:19 crc kubenswrapper[4563]: I1124 
09:19:19.647113 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b876f5fb4-sx5lp" event={"ID":"29588100-1198-4e82-a1c3-87d27b71aa65","Type":"ContainerStarted","Data":"5751af57848c9b0d6541859f746d3f0e0edae16cf38506eb6f35f457bcf96c36"} Nov 24 09:19:19 crc kubenswrapper[4563]: I1124 09:19:19.648011 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-b876f5fb4-sx5lp" Nov 24 09:19:19 crc kubenswrapper[4563]: I1124 09:19:19.689349 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dd5367ce-55f6-4685-b414-4ef54ce7df7a","Type":"ContainerStarted","Data":"65572cd313eb5773636486a95e08b273ac18aef40262fc418d721bbd48b41d6a"} Nov 24 09:19:19 crc kubenswrapper[4563]: I1124 09:19:19.689426 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dd5367ce-55f6-4685-b414-4ef54ce7df7a","Type":"ContainerStarted","Data":"c77e8a4f17369b44e17c56f791531af23b334c9859c1bf52b1fe6f52990043cf"} Nov 24 09:19:19 crc kubenswrapper[4563]: I1124 09:19:19.689412 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7cd5c59c66-hrmf5" podUID="0ec5b651-57ef-414b-8c8e-4b488d71663f" containerName="horizon-log" containerID="cri-o://babd8f40dad9a932d16d97dcc9b930bba3d8e659d2c29350a0aba7329d1d2209" gracePeriod=30 Nov 24 09:19:19 crc kubenswrapper[4563]: I1124 09:19:19.689545 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7cd5c59c66-hrmf5" podUID="0ec5b651-57ef-414b-8c8e-4b488d71663f" containerName="horizon" containerID="cri-o://580a67c88349862289ccfdb112589708d48749b86e526a80925a6ba1dc67dab0" gracePeriod=30 Nov 24 09:19:19 crc kubenswrapper[4563]: I1124 09:19:19.706213 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-b876f5fb4-sx5lp" podStartSLOduration=1.706190479 podStartE2EDuration="1.706190479s" 
podCreationTimestamp="2025-11-24 09:19:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:19:19.682009293 +0000 UTC m=+936.940986740" watchObservedRunningTime="2025-11-24 09:19:19.706190479 +0000 UTC m=+936.965167926" Nov 24 09:19:20 crc kubenswrapper[4563]: I1124 09:19:20.699009 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dd5367ce-55f6-4685-b414-4ef54ce7df7a","Type":"ContainerStarted","Data":"80ead4e79f65ef32d87ad098ec0ef424ca9d29f15c34e0591d00bcd81bf3360d"} Nov 24 09:19:20 crc kubenswrapper[4563]: I1124 09:19:20.699362 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 24 09:19:20 crc kubenswrapper[4563]: I1124 09:19:20.700713 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3effc21a-ac22-4712-ae88-2b318473ccee","Type":"ContainerStarted","Data":"8a98bcfef27db491edd8912b5ff42eed38497b13f09d69952becdc7fa0da4b16"} Nov 24 09:19:20 crc kubenswrapper[4563]: I1124 09:19:20.700738 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3effc21a-ac22-4712-ae88-2b318473ccee","Type":"ContainerStarted","Data":"b6fde3917e7613a43493a7393ea586cbe9c7a5b335b6ea18d785d6b8c490ce34"} Nov 24 09:19:20 crc kubenswrapper[4563]: I1124 09:19:20.700886 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-b876f5fb4-sx5lp" Nov 24 09:19:21 crc kubenswrapper[4563]: I1124 09:19:21.192398 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-787479d464-54bqr" Nov 24 09:19:21 crc kubenswrapper[4563]: I1124 09:19:21.217654 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.217626092 podStartE2EDuration="4.217626092s" 
podCreationTimestamp="2025-11-24 09:19:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:19:20.717807543 +0000 UTC m=+937.976784990" watchObservedRunningTime="2025-11-24 09:19:21.217626092 +0000 UTC m=+938.476603540" Nov 24 09:19:21 crc kubenswrapper[4563]: I1124 09:19:21.649762 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-85d84cd957-f2sp9" Nov 24 09:19:21 crc kubenswrapper[4563]: I1124 09:19:21.709288 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6ffbb5d-hrkmt"] Nov 24 09:19:21 crc kubenswrapper[4563]: I1124 09:19:21.709494 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6ffbb5d-hrkmt" podUID="80a0c11a-54b3-4823-86c1-8b48b37b46e4" containerName="neutron-api" containerID="cri-o://6d3a5c2335736b9cafe2fc177ead532e75faac165ee72181e066a3aeeb0e1aff" gracePeriod=30 Nov 24 09:19:21 crc kubenswrapper[4563]: I1124 09:19:21.709836 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6ffbb5d-hrkmt" podUID="80a0c11a-54b3-4823-86c1-8b48b37b46e4" containerName="neutron-httpd" containerID="cri-o://f49b10e8d89be654f65a1d0c8844563f6bd522c3bef7627f789f59fafff60484" gracePeriod=30 Nov 24 09:19:21 crc kubenswrapper[4563]: I1124 09:19:21.741975 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3effc21a-ac22-4712-ae88-2b318473ccee","Type":"ContainerStarted","Data":"ff977e76067f33a031af7645864cacf72620d2c3aa9e68370edda18de51c0c50"} Nov 24 09:19:22 crc kubenswrapper[4563]: I1124 09:19:22.111211 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 24 09:19:22 crc kubenswrapper[4563]: I1124 09:19:22.153225 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 24 
09:19:22 crc kubenswrapper[4563]: I1124 09:19:22.498535 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-797bbc649-f7c7l" Nov 24 09:19:22 crc kubenswrapper[4563]: I1124 09:19:22.639655 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c654c9745-9ss87"] Nov 24 09:19:22 crc kubenswrapper[4563]: I1124 09:19:22.639959 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c654c9745-9ss87" podUID="1213cb1f-843a-4e6c-b56c-cf39c8108614" containerName="dnsmasq-dns" containerID="cri-o://0439d1adb4816d7c44e31ae913ed92ed69ad6ef565afbc090fd1aa5fea29fa3f" gracePeriod=10 Nov 24 09:19:22 crc kubenswrapper[4563]: I1124 09:19:22.809027 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3effc21a-ac22-4712-ae88-2b318473ccee","Type":"ContainerStarted","Data":"ae7947df36430f48fcac29fe112e2ee8ef75797b24d25dcef436317b2b068fcc"} Nov 24 09:19:22 crc kubenswrapper[4563]: I1124 09:19:22.817780 4563 generic.go:334] "Generic (PLEG): container finished" podID="80a0c11a-54b3-4823-86c1-8b48b37b46e4" containerID="f49b10e8d89be654f65a1d0c8844563f6bd522c3bef7627f789f59fafff60484" exitCode=0 Nov 24 09:19:22 crc kubenswrapper[4563]: I1124 09:19:22.817859 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ffbb5d-hrkmt" event={"ID":"80a0c11a-54b3-4823-86c1-8b48b37b46e4","Type":"ContainerDied","Data":"f49b10e8d89be654f65a1d0c8844563f6bd522c3bef7627f789f59fafff60484"} Nov 24 09:19:22 crc kubenswrapper[4563]: I1124 09:19:22.817984 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="0c44e09d-b8ec-401c-886e-c4e2c589778d" containerName="cinder-scheduler" containerID="cri-o://85d9e9edf50dbd2445fbe782603b12460572db0d385d8f922ad67f5bdd1e8548" gracePeriod=30 Nov 24 09:19:22 crc kubenswrapper[4563]: I1124 09:19:22.818296 4563 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="0c44e09d-b8ec-401c-886e-c4e2c589778d" containerName="probe" containerID="cri-o://28c6f9f3742b86022ae1f8f63ea3f21410a0a3fde2b639019fdfba51ced1f206" gracePeriod=30 Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.493622 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c654c9745-9ss87" Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.613203 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-ovsdbserver-sb\") pod \"1213cb1f-843a-4e6c-b56c-cf39c8108614\" (UID: \"1213cb1f-843a-4e6c-b56c-cf39c8108614\") " Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.613397 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-ovsdbserver-nb\") pod \"1213cb1f-843a-4e6c-b56c-cf39c8108614\" (UID: \"1213cb1f-843a-4e6c-b56c-cf39c8108614\") " Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.613461 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-config\") pod \"1213cb1f-843a-4e6c-b56c-cf39c8108614\" (UID: \"1213cb1f-843a-4e6c-b56c-cf39c8108614\") " Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.613559 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-dns-swift-storage-0\") pod \"1213cb1f-843a-4e6c-b56c-cf39c8108614\" (UID: \"1213cb1f-843a-4e6c-b56c-cf39c8108614\") " Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.613702 4563 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-nctdc\" (UniqueName: \"kubernetes.io/projected/1213cb1f-843a-4e6c-b56c-cf39c8108614-kube-api-access-nctdc\") pod \"1213cb1f-843a-4e6c-b56c-cf39c8108614\" (UID: \"1213cb1f-843a-4e6c-b56c-cf39c8108614\") " Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.613731 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-dns-svc\") pod \"1213cb1f-843a-4e6c-b56c-cf39c8108614\" (UID: \"1213cb1f-843a-4e6c-b56c-cf39c8108614\") " Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.619060 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1213cb1f-843a-4e6c-b56c-cf39c8108614-kube-api-access-nctdc" (OuterVolumeSpecName: "kube-api-access-nctdc") pod "1213cb1f-843a-4e6c-b56c-cf39c8108614" (UID: "1213cb1f-843a-4e6c-b56c-cf39c8108614"). InnerVolumeSpecName "kube-api-access-nctdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.656266 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1213cb1f-843a-4e6c-b56c-cf39c8108614" (UID: "1213cb1f-843a-4e6c-b56c-cf39c8108614"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.658547 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-config" (OuterVolumeSpecName: "config") pod "1213cb1f-843a-4e6c-b56c-cf39c8108614" (UID: "1213cb1f-843a-4e6c-b56c-cf39c8108614"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.659368 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1213cb1f-843a-4e6c-b56c-cf39c8108614" (UID: "1213cb1f-843a-4e6c-b56c-cf39c8108614"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.672434 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1213cb1f-843a-4e6c-b56c-cf39c8108614" (UID: "1213cb1f-843a-4e6c-b56c-cf39c8108614"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.680067 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1213cb1f-843a-4e6c-b56c-cf39c8108614" (UID: "1213cb1f-843a-4e6c-b56c-cf39c8108614"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.715997 4563 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.716028 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.716038 4563 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.716047 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nctdc\" (UniqueName: \"kubernetes.io/projected/1213cb1f-843a-4e6c-b56c-cf39c8108614-kube-api-access-nctdc\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.716057 4563 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.716064 4563 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1213cb1f-843a-4e6c-b56c-cf39c8108614-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.827353 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3effc21a-ac22-4712-ae88-2b318473ccee","Type":"ContainerStarted","Data":"2f1439cf5cbdd509cc7b7d10b940c80bc6189bf88d13a62006c2ce69b4ca5bb4"} Nov 24 09:19:23 crc kubenswrapper[4563]: 
I1124 09:19:23.827480 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.829412 4563 generic.go:334] "Generic (PLEG): container finished" podID="0ec5b651-57ef-414b-8c8e-4b488d71663f" containerID="580a67c88349862289ccfdb112589708d48749b86e526a80925a6ba1dc67dab0" exitCode=0 Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.829464 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cd5c59c66-hrmf5" event={"ID":"0ec5b651-57ef-414b-8c8e-4b488d71663f","Type":"ContainerDied","Data":"580a67c88349862289ccfdb112589708d48749b86e526a80925a6ba1dc67dab0"} Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.834267 4563 generic.go:334] "Generic (PLEG): container finished" podID="0c44e09d-b8ec-401c-886e-c4e2c589778d" containerID="28c6f9f3742b86022ae1f8f63ea3f21410a0a3fde2b639019fdfba51ced1f206" exitCode=0 Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.834358 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0c44e09d-b8ec-401c-886e-c4e2c589778d","Type":"ContainerDied","Data":"28c6f9f3742b86022ae1f8f63ea3f21410a0a3fde2b639019fdfba51ced1f206"} Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.835829 4563 generic.go:334] "Generic (PLEG): container finished" podID="1213cb1f-843a-4e6c-b56c-cf39c8108614" containerID="0439d1adb4816d7c44e31ae913ed92ed69ad6ef565afbc090fd1aa5fea29fa3f" exitCode=0 Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.835856 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c654c9745-9ss87" event={"ID":"1213cb1f-843a-4e6c-b56c-cf39c8108614","Type":"ContainerDied","Data":"0439d1adb4816d7c44e31ae913ed92ed69ad6ef565afbc090fd1aa5fea29fa3f"} Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.835872 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c654c9745-9ss87" 
event={"ID":"1213cb1f-843a-4e6c-b56c-cf39c8108614","Type":"ContainerDied","Data":"88c4743ed51784676e5184b5c9648b4b28318da5ed0b067bd21042c1ef70fa1a"} Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.835887 4563 scope.go:117] "RemoveContainer" containerID="0439d1adb4816d7c44e31ae913ed92ed69ad6ef565afbc090fd1aa5fea29fa3f" Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.835976 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c654c9745-9ss87" Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.864860 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.437166301 podStartE2EDuration="5.864843226s" podCreationTimestamp="2025-11-24 09:19:18 +0000 UTC" firstStartedPulling="2025-11-24 09:19:19.624927797 +0000 UTC m=+936.883905243" lastFinishedPulling="2025-11-24 09:19:23.05260472 +0000 UTC m=+940.311582168" observedRunningTime="2025-11-24 09:19:23.859022178 +0000 UTC m=+941.117999625" watchObservedRunningTime="2025-11-24 09:19:23.864843226 +0000 UTC m=+941.123820673" Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.884284 4563 scope.go:117] "RemoveContainer" containerID="2d3d1a70d147bfdd5f33d5722dc44f45ebce5578c7d57d9e346a08f1e4e200bd" Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.887014 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c654c9745-9ss87"] Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.892777 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c654c9745-9ss87"] Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.914474 4563 scope.go:117] "RemoveContainer" containerID="0439d1adb4816d7c44e31ae913ed92ed69ad6ef565afbc090fd1aa5fea29fa3f" Nov 24 09:19:23 crc kubenswrapper[4563]: E1124 09:19:23.914900 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0439d1adb4816d7c44e31ae913ed92ed69ad6ef565afbc090fd1aa5fea29fa3f\": container with ID starting with 0439d1adb4816d7c44e31ae913ed92ed69ad6ef565afbc090fd1aa5fea29fa3f not found: ID does not exist" containerID="0439d1adb4816d7c44e31ae913ed92ed69ad6ef565afbc090fd1aa5fea29fa3f" Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.914940 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0439d1adb4816d7c44e31ae913ed92ed69ad6ef565afbc090fd1aa5fea29fa3f"} err="failed to get container status \"0439d1adb4816d7c44e31ae913ed92ed69ad6ef565afbc090fd1aa5fea29fa3f\": rpc error: code = NotFound desc = could not find container \"0439d1adb4816d7c44e31ae913ed92ed69ad6ef565afbc090fd1aa5fea29fa3f\": container with ID starting with 0439d1adb4816d7c44e31ae913ed92ed69ad6ef565afbc090fd1aa5fea29fa3f not found: ID does not exist" Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.914963 4563 scope.go:117] "RemoveContainer" containerID="2d3d1a70d147bfdd5f33d5722dc44f45ebce5578c7d57d9e346a08f1e4e200bd" Nov 24 09:19:23 crc kubenswrapper[4563]: E1124 09:19:23.915423 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d3d1a70d147bfdd5f33d5722dc44f45ebce5578c7d57d9e346a08f1e4e200bd\": container with ID starting with 2d3d1a70d147bfdd5f33d5722dc44f45ebce5578c7d57d9e346a08f1e4e200bd not found: ID does not exist" containerID="2d3d1a70d147bfdd5f33d5722dc44f45ebce5578c7d57d9e346a08f1e4e200bd" Nov 24 09:19:23 crc kubenswrapper[4563]: I1124 09:19:23.915448 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d3d1a70d147bfdd5f33d5722dc44f45ebce5578c7d57d9e346a08f1e4e200bd"} err="failed to get container status \"2d3d1a70d147bfdd5f33d5722dc44f45ebce5578c7d57d9e346a08f1e4e200bd\": rpc error: code = NotFound desc = could not find container \"2d3d1a70d147bfdd5f33d5722dc44f45ebce5578c7d57d9e346a08f1e4e200bd\": container with ID 
starting with 2d3d1a70d147bfdd5f33d5722dc44f45ebce5578c7d57d9e346a08f1e4e200bd not found: ID does not exist" Nov 24 09:19:24 crc kubenswrapper[4563]: I1124 09:19:24.649950 4563 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7cd5c59c66-hrmf5" podUID="0ec5b651-57ef-414b-8c8e-4b488d71663f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Nov 24 09:19:25 crc kubenswrapper[4563]: I1124 09:19:25.064810 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1213cb1f-843a-4e6c-b56c-cf39c8108614" path="/var/lib/kubelet/pods/1213cb1f-843a-4e6c-b56c-cf39c8108614/volumes" Nov 24 09:19:25 crc kubenswrapper[4563]: I1124 09:19:25.856369 4563 generic.go:334] "Generic (PLEG): container finished" podID="0c44e09d-b8ec-401c-886e-c4e2c589778d" containerID="85d9e9edf50dbd2445fbe782603b12460572db0d385d8f922ad67f5bdd1e8548" exitCode=0 Nov 24 09:19:25 crc kubenswrapper[4563]: I1124 09:19:25.856454 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0c44e09d-b8ec-401c-886e-c4e2c589778d","Type":"ContainerDied","Data":"85d9e9edf50dbd2445fbe782603b12460572db0d385d8f922ad67f5bdd1e8548"} Nov 24 09:19:25 crc kubenswrapper[4563]: I1124 09:19:25.923188 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.067531 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c44e09d-b8ec-401c-886e-c4e2c589778d-scripts\") pod \"0c44e09d-b8ec-401c-886e-c4e2c589778d\" (UID: \"0c44e09d-b8ec-401c-886e-c4e2c589778d\") " Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.067752 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c44e09d-b8ec-401c-886e-c4e2c589778d-config-data-custom\") pod \"0c44e09d-b8ec-401c-886e-c4e2c589778d\" (UID: \"0c44e09d-b8ec-401c-886e-c4e2c589778d\") " Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.067826 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qjdw\" (UniqueName: \"kubernetes.io/projected/0c44e09d-b8ec-401c-886e-c4e2c589778d-kube-api-access-5qjdw\") pod \"0c44e09d-b8ec-401c-886e-c4e2c589778d\" (UID: \"0c44e09d-b8ec-401c-886e-c4e2c589778d\") " Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.067855 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0c44e09d-b8ec-401c-886e-c4e2c589778d-etc-machine-id\") pod \"0c44e09d-b8ec-401c-886e-c4e2c589778d\" (UID: \"0c44e09d-b8ec-401c-886e-c4e2c589778d\") " Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.067920 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c44e09d-b8ec-401c-886e-c4e2c589778d-combined-ca-bundle\") pod \"0c44e09d-b8ec-401c-886e-c4e2c589778d\" (UID: \"0c44e09d-b8ec-401c-886e-c4e2c589778d\") " Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.067947 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/0c44e09d-b8ec-401c-886e-c4e2c589778d-config-data\") pod \"0c44e09d-b8ec-401c-886e-c4e2c589778d\" (UID: \"0c44e09d-b8ec-401c-886e-c4e2c589778d\") " Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.068080 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c44e09d-b8ec-401c-886e-c4e2c589778d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0c44e09d-b8ec-401c-886e-c4e2c589778d" (UID: "0c44e09d-b8ec-401c-886e-c4e2c589778d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.068423 4563 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0c44e09d-b8ec-401c-886e-c4e2c589778d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.116814 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c44e09d-b8ec-401c-886e-c4e2c589778d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0c44e09d-b8ec-401c-886e-c4e2c589778d" (UID: "0c44e09d-b8ec-401c-886e-c4e2c589778d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.121966 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c44e09d-b8ec-401c-886e-c4e2c589778d-scripts" (OuterVolumeSpecName: "scripts") pod "0c44e09d-b8ec-401c-886e-c4e2c589778d" (UID: "0c44e09d-b8ec-401c-886e-c4e2c589778d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.142365 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c44e09d-b8ec-401c-886e-c4e2c589778d-kube-api-access-5qjdw" (OuterVolumeSpecName: "kube-api-access-5qjdw") pod "0c44e09d-b8ec-401c-886e-c4e2c589778d" (UID: "0c44e09d-b8ec-401c-886e-c4e2c589778d"). InnerVolumeSpecName "kube-api-access-5qjdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.177959 4563 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c44e09d-b8ec-401c-886e-c4e2c589778d-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.177991 4563 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c44e09d-b8ec-401c-886e-c4e2c589778d-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.178000 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qjdw\" (UniqueName: \"kubernetes.io/projected/0c44e09d-b8ec-401c-886e-c4e2c589778d-kube-api-access-5qjdw\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.247823 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c44e09d-b8ec-401c-886e-c4e2c589778d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c44e09d-b8ec-401c-886e-c4e2c589778d" (UID: "0c44e09d-b8ec-401c-886e-c4e2c589778d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.283911 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c44e09d-b8ec-401c-886e-c4e2c589778d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.289862 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c44e09d-b8ec-401c-886e-c4e2c589778d-config-data" (OuterVolumeSpecName: "config-data") pod "0c44e09d-b8ec-401c-886e-c4e2c589778d" (UID: "0c44e09d-b8ec-401c-886e-c4e2c589778d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.385742 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c44e09d-b8ec-401c-886e-c4e2c589778d-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.866998 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0c44e09d-b8ec-401c-886e-c4e2c589778d","Type":"ContainerDied","Data":"58166f2474d31f06367288962ee11279db40d91c22965a289eb50ef20b7bead4"} Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.867032 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.867059 4563 scope.go:117] "RemoveContainer" containerID="28c6f9f3742b86022ae1f8f63ea3f21410a0a3fde2b639019fdfba51ced1f206" Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.888557 4563 scope.go:117] "RemoveContainer" containerID="85d9e9edf50dbd2445fbe782603b12460572db0d385d8f922ad67f5bdd1e8548" Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.893622 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.918982 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.931058 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 24 09:19:26 crc kubenswrapper[4563]: E1124 09:19:26.931480 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1213cb1f-843a-4e6c-b56c-cf39c8108614" containerName="dnsmasq-dns" Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.931497 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="1213cb1f-843a-4e6c-b56c-cf39c8108614" containerName="dnsmasq-dns" Nov 24 09:19:26 crc kubenswrapper[4563]: E1124 09:19:26.931525 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1213cb1f-843a-4e6c-b56c-cf39c8108614" containerName="init" Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.931531 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="1213cb1f-843a-4e6c-b56c-cf39c8108614" containerName="init" Nov 24 09:19:26 crc kubenswrapper[4563]: E1124 09:19:26.931555 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c44e09d-b8ec-401c-886e-c4e2c589778d" containerName="cinder-scheduler" Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.931563 4563 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0c44e09d-b8ec-401c-886e-c4e2c589778d" containerName="cinder-scheduler" Nov 24 09:19:26 crc kubenswrapper[4563]: E1124 09:19:26.931597 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c44e09d-b8ec-401c-886e-c4e2c589778d" containerName="probe" Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.931604 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c44e09d-b8ec-401c-886e-c4e2c589778d" containerName="probe" Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.931801 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c44e09d-b8ec-401c-886e-c4e2c589778d" containerName="probe" Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.931817 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c44e09d-b8ec-401c-886e-c4e2c589778d" containerName="cinder-scheduler" Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.931835 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="1213cb1f-843a-4e6c-b56c-cf39c8108614" containerName="dnsmasq-dns" Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.932838 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.934500 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 24 09:19:26 crc kubenswrapper[4563]: I1124 09:19:26.953138 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 24 09:19:27 crc kubenswrapper[4563]: I1124 09:19:27.065913 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c44e09d-b8ec-401c-886e-c4e2c589778d" path="/var/lib/kubelet/pods/0c44e09d-b8ec-401c-886e-c4e2c589778d/volumes" Nov 24 09:19:27 crc kubenswrapper[4563]: I1124 09:19:27.100705 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:27 crc kubenswrapper[4563]: I1124 09:19:27.100741 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxk4c\" (UniqueName: \"kubernetes.io/projected/0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3-kube-api-access-pxk4c\") pod \"cinder-scheduler-0\" (UID: \"0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:27 crc kubenswrapper[4563]: I1124 09:19:27.100777 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:27 crc kubenswrapper[4563]: I1124 09:19:27.100821 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:27 crc kubenswrapper[4563]: I1124 09:19:27.100844 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3-scripts\") pod \"cinder-scheduler-0\" (UID: \"0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:27 crc kubenswrapper[4563]: I1124 09:19:27.100908 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3-config-data\") pod \"cinder-scheduler-0\" (UID: \"0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:27 crc kubenswrapper[4563]: I1124 09:19:27.201624 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:27 crc kubenswrapper[4563]: I1124 09:19:27.201678 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxk4c\" (UniqueName: \"kubernetes.io/projected/0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3-kube-api-access-pxk4c\") pod \"cinder-scheduler-0\" (UID: \"0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:27 crc kubenswrapper[4563]: I1124 09:19:27.201717 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3-config-data-custom\") pod 
\"cinder-scheduler-0\" (UID: \"0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:27 crc kubenswrapper[4563]: I1124 09:19:27.201765 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:27 crc kubenswrapper[4563]: I1124 09:19:27.201790 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3-scripts\") pod \"cinder-scheduler-0\" (UID: \"0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:27 crc kubenswrapper[4563]: I1124 09:19:27.201820 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3-config-data\") pod \"cinder-scheduler-0\" (UID: \"0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:27 crc kubenswrapper[4563]: I1124 09:19:27.203183 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:27 crc kubenswrapper[4563]: I1124 09:19:27.206633 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:27 crc kubenswrapper[4563]: I1124 09:19:27.207442 4563 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:27 crc kubenswrapper[4563]: I1124 09:19:27.207456 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3-config-data\") pod \"cinder-scheduler-0\" (UID: \"0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:27 crc kubenswrapper[4563]: I1124 09:19:27.217161 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3-scripts\") pod \"cinder-scheduler-0\" (UID: \"0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:27 crc kubenswrapper[4563]: I1124 09:19:27.231213 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxk4c\" (UniqueName: \"kubernetes.io/projected/0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3-kube-api-access-pxk4c\") pod \"cinder-scheduler-0\" (UID: \"0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3\") " pod="openstack/cinder-scheduler-0" Nov 24 09:19:27 crc kubenswrapper[4563]: I1124 09:19:27.252425 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 24 09:19:27 crc kubenswrapper[4563]: I1124 09:19:27.639427 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 24 09:19:27 crc kubenswrapper[4563]: I1124 09:19:27.882037 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3","Type":"ContainerStarted","Data":"51aef519968647548f94e325d6e3d590b49ce25bda62bbe8785b7d15779673ab"} Nov 24 09:19:28 crc kubenswrapper[4563]: I1124 09:19:28.891726 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3","Type":"ContainerStarted","Data":"43d40b7f773186d48431dfc6aed9a14e4fa524931a391e0f5909372368931c1d"} Nov 24 09:19:28 crc kubenswrapper[4563]: I1124 09:19:28.891768 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3","Type":"ContainerStarted","Data":"fa8dea1fe95ab330c080635ae96d2f748c5ebf3a1ebaba43036d1e0ceae49a61"} Nov 24 09:19:28 crc kubenswrapper[4563]: I1124 09:19:28.907805 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.907787061 podStartE2EDuration="2.907787061s" podCreationTimestamp="2025-11-24 09:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:19:28.906947848 +0000 UTC m=+946.165925294" watchObservedRunningTime="2025-11-24 09:19:28.907787061 +0000 UTC m=+946.166764508" Nov 24 09:19:29 crc kubenswrapper[4563]: I1124 09:19:29.842699 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-b876f5fb4-sx5lp" Nov 24 09:19:29 crc kubenswrapper[4563]: I1124 09:19:29.980083 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/keystone-6565cb8596-rhwtd" Nov 24 09:19:30 crc kubenswrapper[4563]: I1124 09:19:30.008664 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-b876f5fb4-sx5lp" Nov 24 09:19:30 crc kubenswrapper[4563]: I1124 09:19:30.043001 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Nov 24 09:19:30 crc kubenswrapper[4563]: I1124 09:19:30.068757 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-787479d464-54bqr"] Nov 24 09:19:30 crc kubenswrapper[4563]: I1124 09:19:30.069096 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-787479d464-54bqr" podUID="626cf8d9-6c74-4bd2-a62f-206bd9ed3c90" containerName="barbican-api-log" containerID="cri-o://a0cf758c57b6cc78147c5f06280ea898ccd750ead778e95095d7f8d2a4da1b9b" gracePeriod=30 Nov 24 09:19:30 crc kubenswrapper[4563]: I1124 09:19:30.069553 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-787479d464-54bqr" podUID="626cf8d9-6c74-4bd2-a62f-206bd9ed3c90" containerName="barbican-api" containerID="cri-o://8f017481fbaa8c5cb10be0c34acda2f08ea259d424d7ae7f802c7c4a2f91b14d" gracePeriod=30 Nov 24 09:19:30 crc kubenswrapper[4563]: I1124 09:19:30.907562 4563 generic.go:334] "Generic (PLEG): container finished" podID="626cf8d9-6c74-4bd2-a62f-206bd9ed3c90" containerID="a0cf758c57b6cc78147c5f06280ea898ccd750ead778e95095d7f8d2a4da1b9b" exitCode=143 Nov 24 09:19:30 crc kubenswrapper[4563]: I1124 09:19:30.907654 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-787479d464-54bqr" event={"ID":"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90","Type":"ContainerDied","Data":"a0cf758c57b6cc78147c5f06280ea898ccd750ead778e95095d7f8d2a4da1b9b"} Nov 24 09:19:32 crc kubenswrapper[4563]: I1124 09:19:32.253982 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 24 09:19:32 crc kubenswrapper[4563]: I1124 09:19:32.499684 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 24 09:19:32 crc kubenswrapper[4563]: I1124 09:19:32.500862 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 24 09:19:32 crc kubenswrapper[4563]: I1124 09:19:32.502426 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Nov 24 09:19:32 crc kubenswrapper[4563]: I1124 09:19:32.502575 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-hskfh" Nov 24 09:19:32 crc kubenswrapper[4563]: I1124 09:19:32.503400 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Nov 24 09:19:32 crc kubenswrapper[4563]: I1124 09:19:32.505462 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 24 09:19:32 crc kubenswrapper[4563]: I1124 09:19:32.530029 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9b1ff524-d9dc-4433-a21c-f6d00e3b89d4-openstack-config-secret\") pod \"openstackclient\" (UID: \"9b1ff524-d9dc-4433-a21c-f6d00e3b89d4\") " pod="openstack/openstackclient" Nov 24 09:19:32 crc kubenswrapper[4563]: I1124 09:19:32.530067 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq2v7\" (UniqueName: \"kubernetes.io/projected/9b1ff524-d9dc-4433-a21c-f6d00e3b89d4-kube-api-access-nq2v7\") pod \"openstackclient\" (UID: \"9b1ff524-d9dc-4433-a21c-f6d00e3b89d4\") " pod="openstack/openstackclient" Nov 24 09:19:32 crc kubenswrapper[4563]: I1124 09:19:32.530111 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9b1ff524-d9dc-4433-a21c-f6d00e3b89d4-openstack-config\") pod \"openstackclient\" (UID: \"9b1ff524-d9dc-4433-a21c-f6d00e3b89d4\") " pod="openstack/openstackclient" Nov 24 09:19:32 crc kubenswrapper[4563]: I1124 09:19:32.530173 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1ff524-d9dc-4433-a21c-f6d00e3b89d4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9b1ff524-d9dc-4433-a21c-f6d00e3b89d4\") " pod="openstack/openstackclient" Nov 24 09:19:32 crc kubenswrapper[4563]: I1124 09:19:32.631838 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1ff524-d9dc-4433-a21c-f6d00e3b89d4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9b1ff524-d9dc-4433-a21c-f6d00e3b89d4\") " pod="openstack/openstackclient" Nov 24 09:19:32 crc kubenswrapper[4563]: I1124 09:19:32.631944 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9b1ff524-d9dc-4433-a21c-f6d00e3b89d4-openstack-config-secret\") pod \"openstackclient\" (UID: \"9b1ff524-d9dc-4433-a21c-f6d00e3b89d4\") " pod="openstack/openstackclient" Nov 24 09:19:32 crc kubenswrapper[4563]: I1124 09:19:32.631975 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq2v7\" (UniqueName: \"kubernetes.io/projected/9b1ff524-d9dc-4433-a21c-f6d00e3b89d4-kube-api-access-nq2v7\") pod \"openstackclient\" (UID: \"9b1ff524-d9dc-4433-a21c-f6d00e3b89d4\") " pod="openstack/openstackclient" Nov 24 09:19:32 crc kubenswrapper[4563]: I1124 09:19:32.632017 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/9b1ff524-d9dc-4433-a21c-f6d00e3b89d4-openstack-config\") pod \"openstackclient\" (UID: \"9b1ff524-d9dc-4433-a21c-f6d00e3b89d4\") " pod="openstack/openstackclient" Nov 24 09:19:32 crc kubenswrapper[4563]: I1124 09:19:32.632980 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9b1ff524-d9dc-4433-a21c-f6d00e3b89d4-openstack-config\") pod \"openstackclient\" (UID: \"9b1ff524-d9dc-4433-a21c-f6d00e3b89d4\") " pod="openstack/openstackclient" Nov 24 09:19:32 crc kubenswrapper[4563]: I1124 09:19:32.637981 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1ff524-d9dc-4433-a21c-f6d00e3b89d4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9b1ff524-d9dc-4433-a21c-f6d00e3b89d4\") " pod="openstack/openstackclient" Nov 24 09:19:32 crc kubenswrapper[4563]: I1124 09:19:32.644989 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9b1ff524-d9dc-4433-a21c-f6d00e3b89d4-openstack-config-secret\") pod \"openstackclient\" (UID: \"9b1ff524-d9dc-4433-a21c-f6d00e3b89d4\") " pod="openstack/openstackclient" Nov 24 09:19:32 crc kubenswrapper[4563]: I1124 09:19:32.649115 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq2v7\" (UniqueName: \"kubernetes.io/projected/9b1ff524-d9dc-4433-a21c-f6d00e3b89d4-kube-api-access-nq2v7\") pod \"openstackclient\" (UID: \"9b1ff524-d9dc-4433-a21c-f6d00e3b89d4\") " pod="openstack/openstackclient" Nov 24 09:19:32 crc kubenswrapper[4563]: I1124 09:19:32.815103 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 24 09:19:33 crc kubenswrapper[4563]: I1124 09:19:33.226276 4563 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-787479d464-54bqr" podUID="626cf8d9-6c74-4bd2-a62f-206bd9ed3c90" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:48226->10.217.0.166:9311: read: connection reset by peer" Nov 24 09:19:33 crc kubenswrapper[4563]: I1124 09:19:33.226322 4563 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-787479d464-54bqr" podUID="626cf8d9-6c74-4bd2-a62f-206bd9ed3c90" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:48228->10.217.0.166:9311: read: connection reset by peer" Nov 24 09:19:33 crc kubenswrapper[4563]: I1124 09:19:33.258612 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 24 09:19:33 crc kubenswrapper[4563]: W1124 09:19:33.295955 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b1ff524_d9dc_4433_a21c_f6d00e3b89d4.slice/crio-23602e5fc91fac4aa1a6161a8bbde3765f1795ccc127448af06ae860cec9cec2 WatchSource:0}: Error finding container 23602e5fc91fac4aa1a6161a8bbde3765f1795ccc127448af06ae860cec9cec2: Status 404 returned error can't find the container with id 23602e5fc91fac4aa1a6161a8bbde3765f1795ccc127448af06ae860cec9cec2 Nov 24 09:19:33 crc kubenswrapper[4563]: I1124 09:19:33.514525 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-787479d464-54bqr" Nov 24 09:19:33 crc kubenswrapper[4563]: I1124 09:19:33.552562 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-config-data-custom\") pod \"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90\" (UID: \"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90\") " Nov 24 09:19:33 crc kubenswrapper[4563]: I1124 09:19:33.552804 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hks9t\" (UniqueName: \"kubernetes.io/projected/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-kube-api-access-hks9t\") pod \"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90\" (UID: \"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90\") " Nov 24 09:19:33 crc kubenswrapper[4563]: I1124 09:19:33.552845 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-combined-ca-bundle\") pod \"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90\" (UID: \"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90\") " Nov 24 09:19:33 crc kubenswrapper[4563]: I1124 09:19:33.552991 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-logs\") pod \"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90\" (UID: \"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90\") " Nov 24 09:19:33 crc kubenswrapper[4563]: I1124 09:19:33.553019 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-config-data\") pod \"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90\" (UID: \"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90\") " Nov 24 09:19:33 crc kubenswrapper[4563]: I1124 09:19:33.553850 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-logs" (OuterVolumeSpecName: "logs") pod "626cf8d9-6c74-4bd2-a62f-206bd9ed3c90" (UID: "626cf8d9-6c74-4bd2-a62f-206bd9ed3c90"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:19:33 crc kubenswrapper[4563]: I1124 09:19:33.558915 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-kube-api-access-hks9t" (OuterVolumeSpecName: "kube-api-access-hks9t") pod "626cf8d9-6c74-4bd2-a62f-206bd9ed3c90" (UID: "626cf8d9-6c74-4bd2-a62f-206bd9ed3c90"). InnerVolumeSpecName "kube-api-access-hks9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:19:33 crc kubenswrapper[4563]: I1124 09:19:33.559444 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "626cf8d9-6c74-4bd2-a62f-206bd9ed3c90" (UID: "626cf8d9-6c74-4bd2-a62f-206bd9ed3c90"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:33 crc kubenswrapper[4563]: I1124 09:19:33.575505 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "626cf8d9-6c74-4bd2-a62f-206bd9ed3c90" (UID: "626cf8d9-6c74-4bd2-a62f-206bd9ed3c90"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:33 crc kubenswrapper[4563]: I1124 09:19:33.592996 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-config-data" (OuterVolumeSpecName: "config-data") pod "626cf8d9-6c74-4bd2-a62f-206bd9ed3c90" (UID: "626cf8d9-6c74-4bd2-a62f-206bd9ed3c90"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:33 crc kubenswrapper[4563]: I1124 09:19:33.654444 4563 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-logs\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:33 crc kubenswrapper[4563]: I1124 09:19:33.654609 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:33 crc kubenswrapper[4563]: I1124 09:19:33.654694 4563 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:33 crc kubenswrapper[4563]: I1124 09:19:33.654749 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hks9t\" (UniqueName: \"kubernetes.io/projected/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-kube-api-access-hks9t\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:33 crc kubenswrapper[4563]: I1124 09:19:33.654795 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:33 crc kubenswrapper[4563]: I1124 09:19:33.937433 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"9b1ff524-d9dc-4433-a21c-f6d00e3b89d4","Type":"ContainerStarted","Data":"23602e5fc91fac4aa1a6161a8bbde3765f1795ccc127448af06ae860cec9cec2"} Nov 24 09:19:33 crc kubenswrapper[4563]: I1124 09:19:33.944105 4563 generic.go:334] "Generic (PLEG): container finished" podID="626cf8d9-6c74-4bd2-a62f-206bd9ed3c90" containerID="8f017481fbaa8c5cb10be0c34acda2f08ea259d424d7ae7f802c7c4a2f91b14d" exitCode=0 Nov 24 09:19:33 
crc kubenswrapper[4563]: I1124 09:19:33.944142 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-787479d464-54bqr" event={"ID":"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90","Type":"ContainerDied","Data":"8f017481fbaa8c5cb10be0c34acda2f08ea259d424d7ae7f802c7c4a2f91b14d"} Nov 24 09:19:33 crc kubenswrapper[4563]: I1124 09:19:33.944164 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-787479d464-54bqr" event={"ID":"626cf8d9-6c74-4bd2-a62f-206bd9ed3c90","Type":"ContainerDied","Data":"20bc5a5a4959c7d953ca940a0f63dd0b9dfb752b785a5ea47654e28d9e0404ff"} Nov 24 09:19:33 crc kubenswrapper[4563]: I1124 09:19:33.944181 4563 scope.go:117] "RemoveContainer" containerID="8f017481fbaa8c5cb10be0c34acda2f08ea259d424d7ae7f802c7c4a2f91b14d" Nov 24 09:19:33 crc kubenswrapper[4563]: I1124 09:19:33.944306 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-787479d464-54bqr" Nov 24 09:19:33 crc kubenswrapper[4563]: I1124 09:19:33.979968 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-787479d464-54bqr"] Nov 24 09:19:33 crc kubenswrapper[4563]: I1124 09:19:33.990692 4563 scope.go:117] "RemoveContainer" containerID="a0cf758c57b6cc78147c5f06280ea898ccd750ead778e95095d7f8d2a4da1b9b" Nov 24 09:19:33 crc kubenswrapper[4563]: I1124 09:19:33.993242 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-787479d464-54bqr"] Nov 24 09:19:34 crc kubenswrapper[4563]: I1124 09:19:34.013395 4563 scope.go:117] "RemoveContainer" containerID="8f017481fbaa8c5cb10be0c34acda2f08ea259d424d7ae7f802c7c4a2f91b14d" Nov 24 09:19:34 crc kubenswrapper[4563]: E1124 09:19:34.014101 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f017481fbaa8c5cb10be0c34acda2f08ea259d424d7ae7f802c7c4a2f91b14d\": container with ID starting with 
8f017481fbaa8c5cb10be0c34acda2f08ea259d424d7ae7f802c7c4a2f91b14d not found: ID does not exist" containerID="8f017481fbaa8c5cb10be0c34acda2f08ea259d424d7ae7f802c7c4a2f91b14d" Nov 24 09:19:34 crc kubenswrapper[4563]: I1124 09:19:34.014152 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f017481fbaa8c5cb10be0c34acda2f08ea259d424d7ae7f802c7c4a2f91b14d"} err="failed to get container status \"8f017481fbaa8c5cb10be0c34acda2f08ea259d424d7ae7f802c7c4a2f91b14d\": rpc error: code = NotFound desc = could not find container \"8f017481fbaa8c5cb10be0c34acda2f08ea259d424d7ae7f802c7c4a2f91b14d\": container with ID starting with 8f017481fbaa8c5cb10be0c34acda2f08ea259d424d7ae7f802c7c4a2f91b14d not found: ID does not exist" Nov 24 09:19:34 crc kubenswrapper[4563]: I1124 09:19:34.014175 4563 scope.go:117] "RemoveContainer" containerID="a0cf758c57b6cc78147c5f06280ea898ccd750ead778e95095d7f8d2a4da1b9b" Nov 24 09:19:34 crc kubenswrapper[4563]: E1124 09:19:34.015134 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0cf758c57b6cc78147c5f06280ea898ccd750ead778e95095d7f8d2a4da1b9b\": container with ID starting with a0cf758c57b6cc78147c5f06280ea898ccd750ead778e95095d7f8d2a4da1b9b not found: ID does not exist" containerID="a0cf758c57b6cc78147c5f06280ea898ccd750ead778e95095d7f8d2a4da1b9b" Nov 24 09:19:34 crc kubenswrapper[4563]: I1124 09:19:34.015188 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0cf758c57b6cc78147c5f06280ea898ccd750ead778e95095d7f8d2a4da1b9b"} err="failed to get container status \"a0cf758c57b6cc78147c5f06280ea898ccd750ead778e95095d7f8d2a4da1b9b\": rpc error: code = NotFound desc = could not find container \"a0cf758c57b6cc78147c5f06280ea898ccd750ead778e95095d7f8d2a4da1b9b\": container with ID starting with a0cf758c57b6cc78147c5f06280ea898ccd750ead778e95095d7f8d2a4da1b9b not found: ID does not 
exist" Nov 24 09:19:34 crc kubenswrapper[4563]: I1124 09:19:34.595809 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6ffbb5d-hrkmt" Nov 24 09:19:34 crc kubenswrapper[4563]: I1124 09:19:34.648955 4563 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7cd5c59c66-hrmf5" podUID="0ec5b651-57ef-414b-8c8e-4b488d71663f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Nov 24 09:19:34 crc kubenswrapper[4563]: I1124 09:19:34.675114 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/80a0c11a-54b3-4823-86c1-8b48b37b46e4-httpd-config\") pod \"80a0c11a-54b3-4823-86c1-8b48b37b46e4\" (UID: \"80a0c11a-54b3-4823-86c1-8b48b37b46e4\") " Nov 24 09:19:34 crc kubenswrapper[4563]: I1124 09:19:34.675905 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80a0c11a-54b3-4823-86c1-8b48b37b46e4-combined-ca-bundle\") pod \"80a0c11a-54b3-4823-86c1-8b48b37b46e4\" (UID: \"80a0c11a-54b3-4823-86c1-8b48b37b46e4\") " Nov 24 09:19:34 crc kubenswrapper[4563]: I1124 09:19:34.675942 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/80a0c11a-54b3-4823-86c1-8b48b37b46e4-config\") pod \"80a0c11a-54b3-4823-86c1-8b48b37b46e4\" (UID: \"80a0c11a-54b3-4823-86c1-8b48b37b46e4\") " Nov 24 09:19:34 crc kubenswrapper[4563]: I1124 09:19:34.676040 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/80a0c11a-54b3-4823-86c1-8b48b37b46e4-ovndb-tls-certs\") pod \"80a0c11a-54b3-4823-86c1-8b48b37b46e4\" (UID: \"80a0c11a-54b3-4823-86c1-8b48b37b46e4\") " Nov 24 09:19:34 crc 
kubenswrapper[4563]: I1124 09:19:34.676083 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2pkr\" (UniqueName: \"kubernetes.io/projected/80a0c11a-54b3-4823-86c1-8b48b37b46e4-kube-api-access-c2pkr\") pod \"80a0c11a-54b3-4823-86c1-8b48b37b46e4\" (UID: \"80a0c11a-54b3-4823-86c1-8b48b37b46e4\") " Nov 24 09:19:34 crc kubenswrapper[4563]: I1124 09:19:34.695982 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80a0c11a-54b3-4823-86c1-8b48b37b46e4-kube-api-access-c2pkr" (OuterVolumeSpecName: "kube-api-access-c2pkr") pod "80a0c11a-54b3-4823-86c1-8b48b37b46e4" (UID: "80a0c11a-54b3-4823-86c1-8b48b37b46e4"). InnerVolumeSpecName "kube-api-access-c2pkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:19:34 crc kubenswrapper[4563]: I1124 09:19:34.696062 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80a0c11a-54b3-4823-86c1-8b48b37b46e4-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "80a0c11a-54b3-4823-86c1-8b48b37b46e4" (UID: "80a0c11a-54b3-4823-86c1-8b48b37b46e4"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:34 crc kubenswrapper[4563]: I1124 09:19:34.724414 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80a0c11a-54b3-4823-86c1-8b48b37b46e4-config" (OuterVolumeSpecName: "config") pod "80a0c11a-54b3-4823-86c1-8b48b37b46e4" (UID: "80a0c11a-54b3-4823-86c1-8b48b37b46e4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:34 crc kubenswrapper[4563]: I1124 09:19:34.752908 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80a0c11a-54b3-4823-86c1-8b48b37b46e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80a0c11a-54b3-4823-86c1-8b48b37b46e4" (UID: "80a0c11a-54b3-4823-86c1-8b48b37b46e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:34 crc kubenswrapper[4563]: I1124 09:19:34.753254 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80a0c11a-54b3-4823-86c1-8b48b37b46e4-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "80a0c11a-54b3-4823-86c1-8b48b37b46e4" (UID: "80a0c11a-54b3-4823-86c1-8b48b37b46e4"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:34 crc kubenswrapper[4563]: I1124 09:19:34.778047 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2pkr\" (UniqueName: \"kubernetes.io/projected/80a0c11a-54b3-4823-86c1-8b48b37b46e4-kube-api-access-c2pkr\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:34 crc kubenswrapper[4563]: I1124 09:19:34.778076 4563 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/80a0c11a-54b3-4823-86c1-8b48b37b46e4-httpd-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:34 crc kubenswrapper[4563]: I1124 09:19:34.778087 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80a0c11a-54b3-4823-86c1-8b48b37b46e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:34 crc kubenswrapper[4563]: I1124 09:19:34.778099 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/80a0c11a-54b3-4823-86c1-8b48b37b46e4-config\") on node \"crc\" DevicePath 
\"\"" Nov 24 09:19:34 crc kubenswrapper[4563]: I1124 09:19:34.778108 4563 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/80a0c11a-54b3-4823-86c1-8b48b37b46e4-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:34 crc kubenswrapper[4563]: I1124 09:19:34.961017 4563 generic.go:334] "Generic (PLEG): container finished" podID="80a0c11a-54b3-4823-86c1-8b48b37b46e4" containerID="6d3a5c2335736b9cafe2fc177ead532e75faac165ee72181e066a3aeeb0e1aff" exitCode=0 Nov 24 09:19:34 crc kubenswrapper[4563]: I1124 09:19:34.961275 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ffbb5d-hrkmt" event={"ID":"80a0c11a-54b3-4823-86c1-8b48b37b46e4","Type":"ContainerDied","Data":"6d3a5c2335736b9cafe2fc177ead532e75faac165ee72181e066a3aeeb0e1aff"} Nov 24 09:19:34 crc kubenswrapper[4563]: I1124 09:19:34.961537 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ffbb5d-hrkmt" event={"ID":"80a0c11a-54b3-4823-86c1-8b48b37b46e4","Type":"ContainerDied","Data":"1b76a76243dd63ee93f1a4747485921334c914e6cef40e436da94ec9143bc4c2"} Nov 24 09:19:34 crc kubenswrapper[4563]: I1124 09:19:34.961610 4563 scope.go:117] "RemoveContainer" containerID="f49b10e8d89be654f65a1d0c8844563f6bd522c3bef7627f789f59fafff60484" Nov 24 09:19:34 crc kubenswrapper[4563]: I1124 09:19:34.961372 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6ffbb5d-hrkmt" Nov 24 09:19:34 crc kubenswrapper[4563]: I1124 09:19:34.999045 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6ffbb5d-hrkmt"] Nov 24 09:19:35 crc kubenswrapper[4563]: I1124 09:19:35.000206 4563 scope.go:117] "RemoveContainer" containerID="6d3a5c2335736b9cafe2fc177ead532e75faac165ee72181e066a3aeeb0e1aff" Nov 24 09:19:35 crc kubenswrapper[4563]: I1124 09:19:35.003957 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6ffbb5d-hrkmt"] Nov 24 09:19:35 crc kubenswrapper[4563]: I1124 09:19:35.022254 4563 scope.go:117] "RemoveContainer" containerID="f49b10e8d89be654f65a1d0c8844563f6bd522c3bef7627f789f59fafff60484" Nov 24 09:19:35 crc kubenswrapper[4563]: E1124 09:19:35.022659 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f49b10e8d89be654f65a1d0c8844563f6bd522c3bef7627f789f59fafff60484\": container with ID starting with f49b10e8d89be654f65a1d0c8844563f6bd522c3bef7627f789f59fafff60484 not found: ID does not exist" containerID="f49b10e8d89be654f65a1d0c8844563f6bd522c3bef7627f789f59fafff60484" Nov 24 09:19:35 crc kubenswrapper[4563]: I1124 09:19:35.022710 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f49b10e8d89be654f65a1d0c8844563f6bd522c3bef7627f789f59fafff60484"} err="failed to get container status \"f49b10e8d89be654f65a1d0c8844563f6bd522c3bef7627f789f59fafff60484\": rpc error: code = NotFound desc = could not find container \"f49b10e8d89be654f65a1d0c8844563f6bd522c3bef7627f789f59fafff60484\": container with ID starting with f49b10e8d89be654f65a1d0c8844563f6bd522c3bef7627f789f59fafff60484 not found: ID does not exist" Nov 24 09:19:35 crc kubenswrapper[4563]: I1124 09:19:35.022737 4563 scope.go:117] "RemoveContainer" containerID="6d3a5c2335736b9cafe2fc177ead532e75faac165ee72181e066a3aeeb0e1aff" Nov 24 09:19:35 crc 
kubenswrapper[4563]: E1124 09:19:35.022961 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d3a5c2335736b9cafe2fc177ead532e75faac165ee72181e066a3aeeb0e1aff\": container with ID starting with 6d3a5c2335736b9cafe2fc177ead532e75faac165ee72181e066a3aeeb0e1aff not found: ID does not exist" containerID="6d3a5c2335736b9cafe2fc177ead532e75faac165ee72181e066a3aeeb0e1aff" Nov 24 09:19:35 crc kubenswrapper[4563]: I1124 09:19:35.022989 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d3a5c2335736b9cafe2fc177ead532e75faac165ee72181e066a3aeeb0e1aff"} err="failed to get container status \"6d3a5c2335736b9cafe2fc177ead532e75faac165ee72181e066a3aeeb0e1aff\": rpc error: code = NotFound desc = could not find container \"6d3a5c2335736b9cafe2fc177ead532e75faac165ee72181e066a3aeeb0e1aff\": container with ID starting with 6d3a5c2335736b9cafe2fc177ead532e75faac165ee72181e066a3aeeb0e1aff not found: ID does not exist" Nov 24 09:19:35 crc kubenswrapper[4563]: I1124 09:19:35.068114 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="626cf8d9-6c74-4bd2-a62f-206bd9ed3c90" path="/var/lib/kubelet/pods/626cf8d9-6c74-4bd2-a62f-206bd9ed3c90/volumes" Nov 24 09:19:35 crc kubenswrapper[4563]: I1124 09:19:35.068718 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80a0c11a-54b3-4823-86c1-8b48b37b46e4" path="/var/lib/kubelet/pods/80a0c11a-54b3-4823-86c1-8b48b37b46e4/volumes" Nov 24 09:19:35 crc kubenswrapper[4563]: I1124 09:19:35.681007 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 09:19:35 crc kubenswrapper[4563]: I1124 09:19:35.681945 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="126732ea-0f6a-4fd6-9b5b-959e4da904fe" containerName="glance-log" 
containerID="cri-o://1e2e3f0bbd509f6376d01716104cf491690003d78ec1df0ea20d767237402289" gracePeriod=30 Nov 24 09:19:35 crc kubenswrapper[4563]: I1124 09:19:35.682021 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="126732ea-0f6a-4fd6-9b5b-959e4da904fe" containerName="glance-httpd" containerID="cri-o://123de7ce0128b04a34dc9c98f1bd7fba7f887c0a9e1265cd172ebac4b29c79f6" gracePeriod=30 Nov 24 09:19:35 crc kubenswrapper[4563]: I1124 09:19:35.962648 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:19:35 crc kubenswrapper[4563]: I1124 09:19:35.963113 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3effc21a-ac22-4712-ae88-2b318473ccee" containerName="ceilometer-central-agent" containerID="cri-o://8a98bcfef27db491edd8912b5ff42eed38497b13f09d69952becdc7fa0da4b16" gracePeriod=30 Nov 24 09:19:35 crc kubenswrapper[4563]: I1124 09:19:35.963193 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3effc21a-ac22-4712-ae88-2b318473ccee" containerName="proxy-httpd" containerID="cri-o://2f1439cf5cbdd509cc7b7d10b940c80bc6189bf88d13a62006c2ce69b4ca5bb4" gracePeriod=30 Nov 24 09:19:35 crc kubenswrapper[4563]: I1124 09:19:35.963354 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3effc21a-ac22-4712-ae88-2b318473ccee" containerName="sg-core" containerID="cri-o://ae7947df36430f48fcac29fe112e2ee8ef75797b24d25dcef436317b2b068fcc" gracePeriod=30 Nov 24 09:19:35 crc kubenswrapper[4563]: I1124 09:19:35.963430 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3effc21a-ac22-4712-ae88-2b318473ccee" containerName="ceilometer-notification-agent" containerID="cri-o://ff977e76067f33a031af7645864cacf72620d2c3aa9e68370edda18de51c0c50" 
gracePeriod=30 Nov 24 09:19:35 crc kubenswrapper[4563]: I1124 09:19:35.991538 4563 generic.go:334] "Generic (PLEG): container finished" podID="126732ea-0f6a-4fd6-9b5b-959e4da904fe" containerID="1e2e3f0bbd509f6376d01716104cf491690003d78ec1df0ea20d767237402289" exitCode=143 Nov 24 09:19:35 crc kubenswrapper[4563]: I1124 09:19:35.991606 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"126732ea-0f6a-4fd6-9b5b-959e4da904fe","Type":"ContainerDied","Data":"1e2e3f0bbd509f6376d01716104cf491690003d78ec1df0ea20d767237402289"} Nov 24 09:19:36 crc kubenswrapper[4563]: I1124 09:19:36.067309 4563 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="3effc21a-ac22-4712-ae88-2b318473ccee" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.171:3000/\": read tcp 10.217.0.2:44748->10.217.0.171:3000: read: connection reset by peer" Nov 24 09:19:36 crc kubenswrapper[4563]: I1124 09:19:36.617520 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 09:19:36 crc kubenswrapper[4563]: I1124 09:19:36.719254 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3effc21a-ac22-4712-ae88-2b318473ccee-run-httpd\") pod \"3effc21a-ac22-4712-ae88-2b318473ccee\" (UID: \"3effc21a-ac22-4712-ae88-2b318473ccee\") " Nov 24 09:19:36 crc kubenswrapper[4563]: I1124 09:19:36.719540 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wg2r\" (UniqueName: \"kubernetes.io/projected/3effc21a-ac22-4712-ae88-2b318473ccee-kube-api-access-7wg2r\") pod \"3effc21a-ac22-4712-ae88-2b318473ccee\" (UID: \"3effc21a-ac22-4712-ae88-2b318473ccee\") " Nov 24 09:19:36 crc kubenswrapper[4563]: I1124 09:19:36.720445 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3effc21a-ac22-4712-ae88-2b318473ccee-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3effc21a-ac22-4712-ae88-2b318473ccee" (UID: "3effc21a-ac22-4712-ae88-2b318473ccee"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:19:36 crc kubenswrapper[4563]: I1124 09:19:36.724827 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3effc21a-ac22-4712-ae88-2b318473ccee-kube-api-access-7wg2r" (OuterVolumeSpecName: "kube-api-access-7wg2r") pod "3effc21a-ac22-4712-ae88-2b318473ccee" (UID: "3effc21a-ac22-4712-ae88-2b318473ccee"). InnerVolumeSpecName "kube-api-access-7wg2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:19:36 crc kubenswrapper[4563]: I1124 09:19:36.821906 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3effc21a-ac22-4712-ae88-2b318473ccee-scripts\") pod \"3effc21a-ac22-4712-ae88-2b318473ccee\" (UID: \"3effc21a-ac22-4712-ae88-2b318473ccee\") " Nov 24 09:19:36 crc kubenswrapper[4563]: I1124 09:19:36.821996 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3effc21a-ac22-4712-ae88-2b318473ccee-combined-ca-bundle\") pod \"3effc21a-ac22-4712-ae88-2b318473ccee\" (UID: \"3effc21a-ac22-4712-ae88-2b318473ccee\") " Nov 24 09:19:36 crc kubenswrapper[4563]: I1124 09:19:36.822135 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3effc21a-ac22-4712-ae88-2b318473ccee-log-httpd\") pod \"3effc21a-ac22-4712-ae88-2b318473ccee\" (UID: \"3effc21a-ac22-4712-ae88-2b318473ccee\") " Nov 24 09:19:36 crc kubenswrapper[4563]: I1124 09:19:36.822194 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3effc21a-ac22-4712-ae88-2b318473ccee-config-data\") pod \"3effc21a-ac22-4712-ae88-2b318473ccee\" (UID: \"3effc21a-ac22-4712-ae88-2b318473ccee\") " Nov 24 09:19:36 crc kubenswrapper[4563]: I1124 09:19:36.822221 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3effc21a-ac22-4712-ae88-2b318473ccee-sg-core-conf-yaml\") pod \"3effc21a-ac22-4712-ae88-2b318473ccee\" (UID: \"3effc21a-ac22-4712-ae88-2b318473ccee\") " Nov 24 09:19:36 crc kubenswrapper[4563]: I1124 09:19:36.822873 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3effc21a-ac22-4712-ae88-2b318473ccee-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3effc21a-ac22-4712-ae88-2b318473ccee" (UID: "3effc21a-ac22-4712-ae88-2b318473ccee"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:19:36 crc kubenswrapper[4563]: I1124 09:19:36.823191 4563 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3effc21a-ac22-4712-ae88-2b318473ccee-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:36 crc kubenswrapper[4563]: I1124 09:19:36.823210 4563 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3effc21a-ac22-4712-ae88-2b318473ccee-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:36 crc kubenswrapper[4563]: I1124 09:19:36.823224 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wg2r\" (UniqueName: \"kubernetes.io/projected/3effc21a-ac22-4712-ae88-2b318473ccee-kube-api-access-7wg2r\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:36 crc kubenswrapper[4563]: I1124 09:19:36.828194 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3effc21a-ac22-4712-ae88-2b318473ccee-scripts" (OuterVolumeSpecName: "scripts") pod "3effc21a-ac22-4712-ae88-2b318473ccee" (UID: "3effc21a-ac22-4712-ae88-2b318473ccee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:36 crc kubenswrapper[4563]: I1124 09:19:36.845958 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3effc21a-ac22-4712-ae88-2b318473ccee-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3effc21a-ac22-4712-ae88-2b318473ccee" (UID: "3effc21a-ac22-4712-ae88-2b318473ccee"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:36 crc kubenswrapper[4563]: I1124 09:19:36.884495 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3effc21a-ac22-4712-ae88-2b318473ccee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3effc21a-ac22-4712-ae88-2b318473ccee" (UID: "3effc21a-ac22-4712-ae88-2b318473ccee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:36 crc kubenswrapper[4563]: I1124 09:19:36.908540 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3effc21a-ac22-4712-ae88-2b318473ccee-config-data" (OuterVolumeSpecName: "config-data") pod "3effc21a-ac22-4712-ae88-2b318473ccee" (UID: "3effc21a-ac22-4712-ae88-2b318473ccee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:36 crc kubenswrapper[4563]: I1124 09:19:36.924898 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3effc21a-ac22-4712-ae88-2b318473ccee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:36 crc kubenswrapper[4563]: I1124 09:19:36.924926 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3effc21a-ac22-4712-ae88-2b318473ccee-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:36 crc kubenswrapper[4563]: I1124 09:19:36.924939 4563 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3effc21a-ac22-4712-ae88-2b318473ccee-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:36 crc kubenswrapper[4563]: I1124 09:19:36.924950 4563 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3effc21a-ac22-4712-ae88-2b318473ccee-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:37 
crc kubenswrapper[4563]: I1124 09:19:37.003379 4563 generic.go:334] "Generic (PLEG): container finished" podID="3effc21a-ac22-4712-ae88-2b318473ccee" containerID="2f1439cf5cbdd509cc7b7d10b940c80bc6189bf88d13a62006c2ce69b4ca5bb4" exitCode=0 Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.003415 4563 generic.go:334] "Generic (PLEG): container finished" podID="3effc21a-ac22-4712-ae88-2b318473ccee" containerID="ae7947df36430f48fcac29fe112e2ee8ef75797b24d25dcef436317b2b068fcc" exitCode=2 Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.003423 4563 generic.go:334] "Generic (PLEG): container finished" podID="3effc21a-ac22-4712-ae88-2b318473ccee" containerID="ff977e76067f33a031af7645864cacf72620d2c3aa9e68370edda18de51c0c50" exitCode=0 Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.003431 4563 generic.go:334] "Generic (PLEG): container finished" podID="3effc21a-ac22-4712-ae88-2b318473ccee" containerID="8a98bcfef27db491edd8912b5ff42eed38497b13f09d69952becdc7fa0da4b16" exitCode=0 Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.003420 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3effc21a-ac22-4712-ae88-2b318473ccee","Type":"ContainerDied","Data":"2f1439cf5cbdd509cc7b7d10b940c80bc6189bf88d13a62006c2ce69b4ca5bb4"} Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.003852 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3effc21a-ac22-4712-ae88-2b318473ccee","Type":"ContainerDied","Data":"ae7947df36430f48fcac29fe112e2ee8ef75797b24d25dcef436317b2b068fcc"} Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.003500 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.003884 4563 scope.go:117] "RemoveContainer" containerID="2f1439cf5cbdd509cc7b7d10b940c80bc6189bf88d13a62006c2ce69b4ca5bb4" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.003871 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3effc21a-ac22-4712-ae88-2b318473ccee","Type":"ContainerDied","Data":"ff977e76067f33a031af7645864cacf72620d2c3aa9e68370edda18de51c0c50"} Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.003974 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3effc21a-ac22-4712-ae88-2b318473ccee","Type":"ContainerDied","Data":"8a98bcfef27db491edd8912b5ff42eed38497b13f09d69952becdc7fa0da4b16"} Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.003987 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3effc21a-ac22-4712-ae88-2b318473ccee","Type":"ContainerDied","Data":"b6fde3917e7613a43493a7393ea586cbe9c7a5b335b6ea18d785d6b8c490ce34"} Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.034876 4563 scope.go:117] "RemoveContainer" containerID="ae7947df36430f48fcac29fe112e2ee8ef75797b24d25dcef436317b2b068fcc" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.046223 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.058794 4563 scope.go:117] "RemoveContainer" containerID="ff977e76067f33a031af7645864cacf72620d2c3aa9e68370edda18de51c0c50" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.086195 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.086245 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:19:37 crc kubenswrapper[4563]: E1124 09:19:37.086868 4563 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80a0c11a-54b3-4823-86c1-8b48b37b46e4" containerName="neutron-httpd" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.086885 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="80a0c11a-54b3-4823-86c1-8b48b37b46e4" containerName="neutron-httpd" Nov 24 09:19:37 crc kubenswrapper[4563]: E1124 09:19:37.086923 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80a0c11a-54b3-4823-86c1-8b48b37b46e4" containerName="neutron-api" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.086930 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="80a0c11a-54b3-4823-86c1-8b48b37b46e4" containerName="neutron-api" Nov 24 09:19:37 crc kubenswrapper[4563]: E1124 09:19:37.086952 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3effc21a-ac22-4712-ae88-2b318473ccee" containerName="ceilometer-central-agent" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.086960 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="3effc21a-ac22-4712-ae88-2b318473ccee" containerName="ceilometer-central-agent" Nov 24 09:19:37 crc kubenswrapper[4563]: E1124 09:19:37.086981 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3effc21a-ac22-4712-ae88-2b318473ccee" containerName="proxy-httpd" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.086988 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="3effc21a-ac22-4712-ae88-2b318473ccee" containerName="proxy-httpd" Nov 24 09:19:37 crc kubenswrapper[4563]: E1124 09:19:37.087005 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3effc21a-ac22-4712-ae88-2b318473ccee" containerName="sg-core" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.087011 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="3effc21a-ac22-4712-ae88-2b318473ccee" containerName="sg-core" Nov 24 09:19:37 crc kubenswrapper[4563]: E1124 09:19:37.087025 4563 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="626cf8d9-6c74-4bd2-a62f-206bd9ed3c90" containerName="barbican-api-log" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.087033 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="626cf8d9-6c74-4bd2-a62f-206bd9ed3c90" containerName="barbican-api-log" Nov 24 09:19:37 crc kubenswrapper[4563]: E1124 09:19:37.087045 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3effc21a-ac22-4712-ae88-2b318473ccee" containerName="ceilometer-notification-agent" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.087051 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="3effc21a-ac22-4712-ae88-2b318473ccee" containerName="ceilometer-notification-agent" Nov 24 09:19:37 crc kubenswrapper[4563]: E1124 09:19:37.087087 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="626cf8d9-6c74-4bd2-a62f-206bd9ed3c90" containerName="barbican-api" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.087093 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="626cf8d9-6c74-4bd2-a62f-206bd9ed3c90" containerName="barbican-api" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.087780 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="80a0c11a-54b3-4823-86c1-8b48b37b46e4" containerName="neutron-api" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.087805 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="626cf8d9-6c74-4bd2-a62f-206bd9ed3c90" containerName="barbican-api-log" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.087821 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="3effc21a-ac22-4712-ae88-2b318473ccee" containerName="proxy-httpd" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.087832 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="626cf8d9-6c74-4bd2-a62f-206bd9ed3c90" containerName="barbican-api" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.087851 4563 
memory_manager.go:354] "RemoveStaleState removing state" podUID="3effc21a-ac22-4712-ae88-2b318473ccee" containerName="ceilometer-notification-agent" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.087863 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="3effc21a-ac22-4712-ae88-2b318473ccee" containerName="ceilometer-central-agent" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.087881 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="80a0c11a-54b3-4823-86c1-8b48b37b46e4" containerName="neutron-httpd" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.087889 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="3effc21a-ac22-4712-ae88-2b318473ccee" containerName="sg-core" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.091189 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.091315 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.094606 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.095168 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.116082 4563 scope.go:117] "RemoveContainer" containerID="8a98bcfef27db491edd8912b5ff42eed38497b13f09d69952becdc7fa0da4b16" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.165524 4563 scope.go:117] "RemoveContainer" containerID="2f1439cf5cbdd509cc7b7d10b940c80bc6189bf88d13a62006c2ce69b4ca5bb4" Nov 24 09:19:37 crc kubenswrapper[4563]: E1124 09:19:37.168799 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f1439cf5cbdd509cc7b7d10b940c80bc6189bf88d13a62006c2ce69b4ca5bb4\": container with ID starting with 2f1439cf5cbdd509cc7b7d10b940c80bc6189bf88d13a62006c2ce69b4ca5bb4 not found: ID does not exist" containerID="2f1439cf5cbdd509cc7b7d10b940c80bc6189bf88d13a62006c2ce69b4ca5bb4" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.168840 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f1439cf5cbdd509cc7b7d10b940c80bc6189bf88d13a62006c2ce69b4ca5bb4"} err="failed to get container status \"2f1439cf5cbdd509cc7b7d10b940c80bc6189bf88d13a62006c2ce69b4ca5bb4\": rpc error: code = NotFound desc = could not find container \"2f1439cf5cbdd509cc7b7d10b940c80bc6189bf88d13a62006c2ce69b4ca5bb4\": container with ID starting with 2f1439cf5cbdd509cc7b7d10b940c80bc6189bf88d13a62006c2ce69b4ca5bb4 not found: ID does not exist" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.168865 4563 scope.go:117] "RemoveContainer" containerID="ae7947df36430f48fcac29fe112e2ee8ef75797b24d25dcef436317b2b068fcc" Nov 24 09:19:37 crc 
kubenswrapper[4563]: E1124 09:19:37.170894 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae7947df36430f48fcac29fe112e2ee8ef75797b24d25dcef436317b2b068fcc\": container with ID starting with ae7947df36430f48fcac29fe112e2ee8ef75797b24d25dcef436317b2b068fcc not found: ID does not exist" containerID="ae7947df36430f48fcac29fe112e2ee8ef75797b24d25dcef436317b2b068fcc" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.170927 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae7947df36430f48fcac29fe112e2ee8ef75797b24d25dcef436317b2b068fcc"} err="failed to get container status \"ae7947df36430f48fcac29fe112e2ee8ef75797b24d25dcef436317b2b068fcc\": rpc error: code = NotFound desc = could not find container \"ae7947df36430f48fcac29fe112e2ee8ef75797b24d25dcef436317b2b068fcc\": container with ID starting with ae7947df36430f48fcac29fe112e2ee8ef75797b24d25dcef436317b2b068fcc not found: ID does not exist" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.170947 4563 scope.go:117] "RemoveContainer" containerID="ff977e76067f33a031af7645864cacf72620d2c3aa9e68370edda18de51c0c50" Nov 24 09:19:37 crc kubenswrapper[4563]: E1124 09:19:37.171220 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff977e76067f33a031af7645864cacf72620d2c3aa9e68370edda18de51c0c50\": container with ID starting with ff977e76067f33a031af7645864cacf72620d2c3aa9e68370edda18de51c0c50 not found: ID does not exist" containerID="ff977e76067f33a031af7645864cacf72620d2c3aa9e68370edda18de51c0c50" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.171243 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff977e76067f33a031af7645864cacf72620d2c3aa9e68370edda18de51c0c50"} err="failed to get container status 
\"ff977e76067f33a031af7645864cacf72620d2c3aa9e68370edda18de51c0c50\": rpc error: code = NotFound desc = could not find container \"ff977e76067f33a031af7645864cacf72620d2c3aa9e68370edda18de51c0c50\": container with ID starting with ff977e76067f33a031af7645864cacf72620d2c3aa9e68370edda18de51c0c50 not found: ID does not exist" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.171257 4563 scope.go:117] "RemoveContainer" containerID="8a98bcfef27db491edd8912b5ff42eed38497b13f09d69952becdc7fa0da4b16" Nov 24 09:19:37 crc kubenswrapper[4563]: E1124 09:19:37.171673 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a98bcfef27db491edd8912b5ff42eed38497b13f09d69952becdc7fa0da4b16\": container with ID starting with 8a98bcfef27db491edd8912b5ff42eed38497b13f09d69952becdc7fa0da4b16 not found: ID does not exist" containerID="8a98bcfef27db491edd8912b5ff42eed38497b13f09d69952becdc7fa0da4b16" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.171694 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a98bcfef27db491edd8912b5ff42eed38497b13f09d69952becdc7fa0da4b16"} err="failed to get container status \"8a98bcfef27db491edd8912b5ff42eed38497b13f09d69952becdc7fa0da4b16\": rpc error: code = NotFound desc = could not find container \"8a98bcfef27db491edd8912b5ff42eed38497b13f09d69952becdc7fa0da4b16\": container with ID starting with 8a98bcfef27db491edd8912b5ff42eed38497b13f09d69952becdc7fa0da4b16 not found: ID does not exist" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.171710 4563 scope.go:117] "RemoveContainer" containerID="2f1439cf5cbdd509cc7b7d10b940c80bc6189bf88d13a62006c2ce69b4ca5bb4" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.172064 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f1439cf5cbdd509cc7b7d10b940c80bc6189bf88d13a62006c2ce69b4ca5bb4"} err="failed to get 
container status \"2f1439cf5cbdd509cc7b7d10b940c80bc6189bf88d13a62006c2ce69b4ca5bb4\": rpc error: code = NotFound desc = could not find container \"2f1439cf5cbdd509cc7b7d10b940c80bc6189bf88d13a62006c2ce69b4ca5bb4\": container with ID starting with 2f1439cf5cbdd509cc7b7d10b940c80bc6189bf88d13a62006c2ce69b4ca5bb4 not found: ID does not exist" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.172120 4563 scope.go:117] "RemoveContainer" containerID="ae7947df36430f48fcac29fe112e2ee8ef75797b24d25dcef436317b2b068fcc" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.173665 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae7947df36430f48fcac29fe112e2ee8ef75797b24d25dcef436317b2b068fcc"} err="failed to get container status \"ae7947df36430f48fcac29fe112e2ee8ef75797b24d25dcef436317b2b068fcc\": rpc error: code = NotFound desc = could not find container \"ae7947df36430f48fcac29fe112e2ee8ef75797b24d25dcef436317b2b068fcc\": container with ID starting with ae7947df36430f48fcac29fe112e2ee8ef75797b24d25dcef436317b2b068fcc not found: ID does not exist" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.173691 4563 scope.go:117] "RemoveContainer" containerID="ff977e76067f33a031af7645864cacf72620d2c3aa9e68370edda18de51c0c50" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.173960 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff977e76067f33a031af7645864cacf72620d2c3aa9e68370edda18de51c0c50"} err="failed to get container status \"ff977e76067f33a031af7645864cacf72620d2c3aa9e68370edda18de51c0c50\": rpc error: code = NotFound desc = could not find container \"ff977e76067f33a031af7645864cacf72620d2c3aa9e68370edda18de51c0c50\": container with ID starting with ff977e76067f33a031af7645864cacf72620d2c3aa9e68370edda18de51c0c50 not found: ID does not exist" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.173978 4563 scope.go:117] "RemoveContainer" 
containerID="8a98bcfef27db491edd8912b5ff42eed38497b13f09d69952becdc7fa0da4b16" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.174220 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a98bcfef27db491edd8912b5ff42eed38497b13f09d69952becdc7fa0da4b16"} err="failed to get container status \"8a98bcfef27db491edd8912b5ff42eed38497b13f09d69952becdc7fa0da4b16\": rpc error: code = NotFound desc = could not find container \"8a98bcfef27db491edd8912b5ff42eed38497b13f09d69952becdc7fa0da4b16\": container with ID starting with 8a98bcfef27db491edd8912b5ff42eed38497b13f09d69952becdc7fa0da4b16 not found: ID does not exist" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.174241 4563 scope.go:117] "RemoveContainer" containerID="2f1439cf5cbdd509cc7b7d10b940c80bc6189bf88d13a62006c2ce69b4ca5bb4" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.174455 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f1439cf5cbdd509cc7b7d10b940c80bc6189bf88d13a62006c2ce69b4ca5bb4"} err="failed to get container status \"2f1439cf5cbdd509cc7b7d10b940c80bc6189bf88d13a62006c2ce69b4ca5bb4\": rpc error: code = NotFound desc = could not find container \"2f1439cf5cbdd509cc7b7d10b940c80bc6189bf88d13a62006c2ce69b4ca5bb4\": container with ID starting with 2f1439cf5cbdd509cc7b7d10b940c80bc6189bf88d13a62006c2ce69b4ca5bb4 not found: ID does not exist" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.174477 4563 scope.go:117] "RemoveContainer" containerID="ae7947df36430f48fcac29fe112e2ee8ef75797b24d25dcef436317b2b068fcc" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.174707 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae7947df36430f48fcac29fe112e2ee8ef75797b24d25dcef436317b2b068fcc"} err="failed to get container status \"ae7947df36430f48fcac29fe112e2ee8ef75797b24d25dcef436317b2b068fcc\": rpc error: code = NotFound desc = could 
not find container \"ae7947df36430f48fcac29fe112e2ee8ef75797b24d25dcef436317b2b068fcc\": container with ID starting with ae7947df36430f48fcac29fe112e2ee8ef75797b24d25dcef436317b2b068fcc not found: ID does not exist" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.174726 4563 scope.go:117] "RemoveContainer" containerID="ff977e76067f33a031af7645864cacf72620d2c3aa9e68370edda18de51c0c50" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.174993 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff977e76067f33a031af7645864cacf72620d2c3aa9e68370edda18de51c0c50"} err="failed to get container status \"ff977e76067f33a031af7645864cacf72620d2c3aa9e68370edda18de51c0c50\": rpc error: code = NotFound desc = could not find container \"ff977e76067f33a031af7645864cacf72620d2c3aa9e68370edda18de51c0c50\": container with ID starting with ff977e76067f33a031af7645864cacf72620d2c3aa9e68370edda18de51c0c50 not found: ID does not exist" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.175013 4563 scope.go:117] "RemoveContainer" containerID="8a98bcfef27db491edd8912b5ff42eed38497b13f09d69952becdc7fa0da4b16" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.175276 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a98bcfef27db491edd8912b5ff42eed38497b13f09d69952becdc7fa0da4b16"} err="failed to get container status \"8a98bcfef27db491edd8912b5ff42eed38497b13f09d69952becdc7fa0da4b16\": rpc error: code = NotFound desc = could not find container \"8a98bcfef27db491edd8912b5ff42eed38497b13f09d69952becdc7fa0da4b16\": container with ID starting with 8a98bcfef27db491edd8912b5ff42eed38497b13f09d69952becdc7fa0da4b16 not found: ID does not exist" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.175295 4563 scope.go:117] "RemoveContainer" containerID="2f1439cf5cbdd509cc7b7d10b940c80bc6189bf88d13a62006c2ce69b4ca5bb4" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 
09:19:37.175477 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f1439cf5cbdd509cc7b7d10b940c80bc6189bf88d13a62006c2ce69b4ca5bb4"} err="failed to get container status \"2f1439cf5cbdd509cc7b7d10b940c80bc6189bf88d13a62006c2ce69b4ca5bb4\": rpc error: code = NotFound desc = could not find container \"2f1439cf5cbdd509cc7b7d10b940c80bc6189bf88d13a62006c2ce69b4ca5bb4\": container with ID starting with 2f1439cf5cbdd509cc7b7d10b940c80bc6189bf88d13a62006c2ce69b4ca5bb4 not found: ID does not exist" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.175497 4563 scope.go:117] "RemoveContainer" containerID="ae7947df36430f48fcac29fe112e2ee8ef75797b24d25dcef436317b2b068fcc" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.175778 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae7947df36430f48fcac29fe112e2ee8ef75797b24d25dcef436317b2b068fcc"} err="failed to get container status \"ae7947df36430f48fcac29fe112e2ee8ef75797b24d25dcef436317b2b068fcc\": rpc error: code = NotFound desc = could not find container \"ae7947df36430f48fcac29fe112e2ee8ef75797b24d25dcef436317b2b068fcc\": container with ID starting with ae7947df36430f48fcac29fe112e2ee8ef75797b24d25dcef436317b2b068fcc not found: ID does not exist" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.175799 4563 scope.go:117] "RemoveContainer" containerID="ff977e76067f33a031af7645864cacf72620d2c3aa9e68370edda18de51c0c50" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.176004 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff977e76067f33a031af7645864cacf72620d2c3aa9e68370edda18de51c0c50"} err="failed to get container status \"ff977e76067f33a031af7645864cacf72620d2c3aa9e68370edda18de51c0c50\": rpc error: code = NotFound desc = could not find container \"ff977e76067f33a031af7645864cacf72620d2c3aa9e68370edda18de51c0c50\": container with ID starting with 
ff977e76067f33a031af7645864cacf72620d2c3aa9e68370edda18de51c0c50 not found: ID does not exist" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.176027 4563 scope.go:117] "RemoveContainer" containerID="8a98bcfef27db491edd8912b5ff42eed38497b13f09d69952becdc7fa0da4b16" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.176362 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a98bcfef27db491edd8912b5ff42eed38497b13f09d69952becdc7fa0da4b16"} err="failed to get container status \"8a98bcfef27db491edd8912b5ff42eed38497b13f09d69952becdc7fa0da4b16\": rpc error: code = NotFound desc = could not find container \"8a98bcfef27db491edd8912b5ff42eed38497b13f09d69952becdc7fa0da4b16\": container with ID starting with 8a98bcfef27db491edd8912b5ff42eed38497b13f09d69952becdc7fa0da4b16 not found: ID does not exist" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.230237 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeeceac4-ca86-4f39-8871-15ef4052ff3d-config-data\") pod \"ceilometer-0\" (UID: \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\") " pod="openstack/ceilometer-0" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.230332 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aeeceac4-ca86-4f39-8871-15ef4052ff3d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\") " pod="openstack/ceilometer-0" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.230354 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc5mt\" (UniqueName: \"kubernetes.io/projected/aeeceac4-ca86-4f39-8871-15ef4052ff3d-kube-api-access-lc5mt\") pod \"ceilometer-0\" (UID: \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\") " 
pod="openstack/ceilometer-0" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.230390 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aeeceac4-ca86-4f39-8871-15ef4052ff3d-scripts\") pod \"ceilometer-0\" (UID: \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\") " pod="openstack/ceilometer-0" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.230416 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeeceac4-ca86-4f39-8871-15ef4052ff3d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\") " pod="openstack/ceilometer-0" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.230438 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeeceac4-ca86-4f39-8871-15ef4052ff3d-run-httpd\") pod \"ceilometer-0\" (UID: \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\") " pod="openstack/ceilometer-0" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.230482 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeeceac4-ca86-4f39-8871-15ef4052ff3d-log-httpd\") pod \"ceilometer-0\" (UID: \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\") " pod="openstack/ceilometer-0" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.278199 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5487cdb76f-rn9rx"] Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.279812 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.282802 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.284242 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.284410 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.284509 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5487cdb76f-rn9rx"] Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.333254 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeeceac4-ca86-4f39-8871-15ef4052ff3d-config-data\") pod \"ceilometer-0\" (UID: \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\") " pod="openstack/ceilometer-0" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.333375 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64984138-1ff3-4d53-b4b9-e301fc5f2f80-internal-tls-certs\") pod \"swift-proxy-5487cdb76f-rn9rx\" (UID: \"64984138-1ff3-4d53-b4b9-e301fc5f2f80\") " pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.333454 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64984138-1ff3-4d53-b4b9-e301fc5f2f80-public-tls-certs\") pod \"swift-proxy-5487cdb76f-rn9rx\" (UID: \"64984138-1ff3-4d53-b4b9-e301fc5f2f80\") " pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.333555 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64984138-1ff3-4d53-b4b9-e301fc5f2f80-run-httpd\") pod \"swift-proxy-5487cdb76f-rn9rx\" (UID: \"64984138-1ff3-4d53-b4b9-e301fc5f2f80\") " pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.333592 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aeeceac4-ca86-4f39-8871-15ef4052ff3d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\") " pod="openstack/ceilometer-0" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.333628 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc5mt\" (UniqueName: \"kubernetes.io/projected/aeeceac4-ca86-4f39-8871-15ef4052ff3d-kube-api-access-lc5mt\") pod \"ceilometer-0\" (UID: \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\") " pod="openstack/ceilometer-0" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.333965 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64984138-1ff3-4d53-b4b9-e301fc5f2f80-config-data\") pod \"swift-proxy-5487cdb76f-rn9rx\" (UID: \"64984138-1ff3-4d53-b4b9-e301fc5f2f80\") " pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.334074 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7s5f\" (UniqueName: \"kubernetes.io/projected/64984138-1ff3-4d53-b4b9-e301fc5f2f80-kube-api-access-x7s5f\") pod \"swift-proxy-5487cdb76f-rn9rx\" (UID: \"64984138-1ff3-4d53-b4b9-e301fc5f2f80\") " pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.334107 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/aeeceac4-ca86-4f39-8871-15ef4052ff3d-scripts\") pod \"ceilometer-0\" (UID: \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\") " pod="openstack/ceilometer-0" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.334171 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeeceac4-ca86-4f39-8871-15ef4052ff3d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\") " pod="openstack/ceilometer-0" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.334207 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeeceac4-ca86-4f39-8871-15ef4052ff3d-run-httpd\") pod \"ceilometer-0\" (UID: \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\") " pod="openstack/ceilometer-0" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.334235 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64984138-1ff3-4d53-b4b9-e301fc5f2f80-log-httpd\") pod \"swift-proxy-5487cdb76f-rn9rx\" (UID: \"64984138-1ff3-4d53-b4b9-e301fc5f2f80\") " pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.334256 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64984138-1ff3-4d53-b4b9-e301fc5f2f80-combined-ca-bundle\") pod \"swift-proxy-5487cdb76f-rn9rx\" (UID: \"64984138-1ff3-4d53-b4b9-e301fc5f2f80\") " pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.334278 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/64984138-1ff3-4d53-b4b9-e301fc5f2f80-etc-swift\") pod 
\"swift-proxy-5487cdb76f-rn9rx\" (UID: \"64984138-1ff3-4d53-b4b9-e301fc5f2f80\") " pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.334300 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeeceac4-ca86-4f39-8871-15ef4052ff3d-log-httpd\") pod \"ceilometer-0\" (UID: \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\") " pod="openstack/ceilometer-0" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.334787 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeeceac4-ca86-4f39-8871-15ef4052ff3d-run-httpd\") pod \"ceilometer-0\" (UID: \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\") " pod="openstack/ceilometer-0" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.335009 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeeceac4-ca86-4f39-8871-15ef4052ff3d-log-httpd\") pod \"ceilometer-0\" (UID: \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\") " pod="openstack/ceilometer-0" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.338277 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeeceac4-ca86-4f39-8871-15ef4052ff3d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\") " pod="openstack/ceilometer-0" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.338390 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aeeceac4-ca86-4f39-8871-15ef4052ff3d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\") " pod="openstack/ceilometer-0" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.339366 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/aeeceac4-ca86-4f39-8871-15ef4052ff3d-scripts\") pod \"ceilometer-0\" (UID: \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\") " pod="openstack/ceilometer-0" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.340744 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeeceac4-ca86-4f39-8871-15ef4052ff3d-config-data\") pod \"ceilometer-0\" (UID: \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\") " pod="openstack/ceilometer-0" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.347382 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc5mt\" (UniqueName: \"kubernetes.io/projected/aeeceac4-ca86-4f39-8871-15ef4052ff3d-kube-api-access-lc5mt\") pod \"ceilometer-0\" (UID: \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\") " pod="openstack/ceilometer-0" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.412280 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.436464 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/64984138-1ff3-4d53-b4b9-e301fc5f2f80-etc-swift\") pod \"swift-proxy-5487cdb76f-rn9rx\" (UID: \"64984138-1ff3-4d53-b4b9-e301fc5f2f80\") " pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.436850 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64984138-1ff3-4d53-b4b9-e301fc5f2f80-internal-tls-certs\") pod \"swift-proxy-5487cdb76f-rn9rx\" (UID: \"64984138-1ff3-4d53-b4b9-e301fc5f2f80\") " pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.436906 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64984138-1ff3-4d53-b4b9-e301fc5f2f80-public-tls-certs\") pod \"swift-proxy-5487cdb76f-rn9rx\" (UID: \"64984138-1ff3-4d53-b4b9-e301fc5f2f80\") " pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.437360 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64984138-1ff3-4d53-b4b9-e301fc5f2f80-run-httpd\") pod \"swift-proxy-5487cdb76f-rn9rx\" (UID: \"64984138-1ff3-4d53-b4b9-e301fc5f2f80\") " pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.437428 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64984138-1ff3-4d53-b4b9-e301fc5f2f80-config-data\") pod \"swift-proxy-5487cdb76f-rn9rx\" (UID: \"64984138-1ff3-4d53-b4b9-e301fc5f2f80\") " pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:37 crc 
kubenswrapper[4563]: I1124 09:19:37.437470 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7s5f\" (UniqueName: \"kubernetes.io/projected/64984138-1ff3-4d53-b4b9-e301fc5f2f80-kube-api-access-x7s5f\") pod \"swift-proxy-5487cdb76f-rn9rx\" (UID: \"64984138-1ff3-4d53-b4b9-e301fc5f2f80\") " pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.437909 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64984138-1ff3-4d53-b4b9-e301fc5f2f80-log-httpd\") pod \"swift-proxy-5487cdb76f-rn9rx\" (UID: \"64984138-1ff3-4d53-b4b9-e301fc5f2f80\") " pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.437937 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64984138-1ff3-4d53-b4b9-e301fc5f2f80-combined-ca-bundle\") pod \"swift-proxy-5487cdb76f-rn9rx\" (UID: \"64984138-1ff3-4d53-b4b9-e301fc5f2f80\") " pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.439317 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64984138-1ff3-4d53-b4b9-e301fc5f2f80-log-httpd\") pod \"swift-proxy-5487cdb76f-rn9rx\" (UID: \"64984138-1ff3-4d53-b4b9-e301fc5f2f80\") " pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.439402 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64984138-1ff3-4d53-b4b9-e301fc5f2f80-run-httpd\") pod \"swift-proxy-5487cdb76f-rn9rx\" (UID: \"64984138-1ff3-4d53-b4b9-e301fc5f2f80\") " pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.442329 4563 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64984138-1ff3-4d53-b4b9-e301fc5f2f80-combined-ca-bundle\") pod \"swift-proxy-5487cdb76f-rn9rx\" (UID: \"64984138-1ff3-4d53-b4b9-e301fc5f2f80\") " pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.442918 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/64984138-1ff3-4d53-b4b9-e301fc5f2f80-etc-swift\") pod \"swift-proxy-5487cdb76f-rn9rx\" (UID: \"64984138-1ff3-4d53-b4b9-e301fc5f2f80\") " pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.442965 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64984138-1ff3-4d53-b4b9-e301fc5f2f80-config-data\") pod \"swift-proxy-5487cdb76f-rn9rx\" (UID: \"64984138-1ff3-4d53-b4b9-e301fc5f2f80\") " pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.443323 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64984138-1ff3-4d53-b4b9-e301fc5f2f80-public-tls-certs\") pod \"swift-proxy-5487cdb76f-rn9rx\" (UID: \"64984138-1ff3-4d53-b4b9-e301fc5f2f80\") " pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.443479 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64984138-1ff3-4d53-b4b9-e301fc5f2f80-internal-tls-certs\") pod \"swift-proxy-5487cdb76f-rn9rx\" (UID: \"64984138-1ff3-4d53-b4b9-e301fc5f2f80\") " pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.453872 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7s5f\" (UniqueName: 
\"kubernetes.io/projected/64984138-1ff3-4d53-b4b9-e301fc5f2f80-kube-api-access-x7s5f\") pod \"swift-proxy-5487cdb76f-rn9rx\" (UID: \"64984138-1ff3-4d53-b4b9-e301fc5f2f80\") " pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.499265 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.531982 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.608663 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:37 crc kubenswrapper[4563]: I1124 09:19:37.869217 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:19:38 crc kubenswrapper[4563]: I1124 09:19:38.836519 4563 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="126732ea-0f6a-4fd6-9b5b-959e4da904fe" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.154:9292/healthcheck\": read tcp 10.217.0.2:34936->10.217.0.154:9292: read: connection reset by peer" Nov 24 09:19:38 crc kubenswrapper[4563]: I1124 09:19:38.836523 4563 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="126732ea-0f6a-4fd6-9b5b-959e4da904fe" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.154:9292/healthcheck\": read tcp 10.217.0.2:34942->10.217.0.154:9292: read: connection reset by peer" Nov 24 09:19:39 crc kubenswrapper[4563]: I1124 09:19:39.023112 4563 generic.go:334] "Generic (PLEG): container finished" podID="126732ea-0f6a-4fd6-9b5b-959e4da904fe" containerID="123de7ce0128b04a34dc9c98f1bd7fba7f887c0a9e1265cd172ebac4b29c79f6" exitCode=0 Nov 24 09:19:39 crc kubenswrapper[4563]: I1124 09:19:39.023150 4563 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"126732ea-0f6a-4fd6-9b5b-959e4da904fe","Type":"ContainerDied","Data":"123de7ce0128b04a34dc9c98f1bd7fba7f887c0a9e1265cd172ebac4b29c79f6"} Nov 24 09:19:39 crc kubenswrapper[4563]: I1124 09:19:39.068142 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3effc21a-ac22-4712-ae88-2b318473ccee" path="/var/lib/kubelet/pods/3effc21a-ac22-4712-ae88-2b318473ccee/volumes" Nov 24 09:19:39 crc kubenswrapper[4563]: I1124 09:19:39.756105 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-zzpdk"] Nov 24 09:19:39 crc kubenswrapper[4563]: I1124 09:19:39.757337 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zzpdk" Nov 24 09:19:39 crc kubenswrapper[4563]: I1124 09:19:39.772959 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-zzpdk"] Nov 24 09:19:39 crc kubenswrapper[4563]: I1124 09:19:39.855444 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-9nwp9"] Nov 24 09:19:39 crc kubenswrapper[4563]: I1124 09:19:39.856664 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-9nwp9" Nov 24 09:19:39 crc kubenswrapper[4563]: I1124 09:19:39.866448 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9nwp9"] Nov 24 09:19:39 crc kubenswrapper[4563]: I1124 09:19:39.884890 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54187c3a-449a-420f-a04c-f4e6c6f7fc3f-operator-scripts\") pod \"nova-api-db-create-zzpdk\" (UID: \"54187c3a-449a-420f-a04c-f4e6c6f7fc3f\") " pod="openstack/nova-api-db-create-zzpdk" Nov 24 09:19:39 crc kubenswrapper[4563]: I1124 09:19:39.884931 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bns8\" (UniqueName: \"kubernetes.io/projected/54187c3a-449a-420f-a04c-f4e6c6f7fc3f-kube-api-access-9bns8\") pod \"nova-api-db-create-zzpdk\" (UID: \"54187c3a-449a-420f-a04c-f4e6c6f7fc3f\") " pod="openstack/nova-api-db-create-zzpdk" Nov 24 09:19:39 crc kubenswrapper[4563]: I1124 09:19:39.970859 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0283-account-create-wklw4"] Nov 24 09:19:39 crc kubenswrapper[4563]: I1124 09:19:39.973286 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0283-account-create-wklw4" Nov 24 09:19:39 crc kubenswrapper[4563]: I1124 09:19:39.974705 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Nov 24 09:19:39 crc kubenswrapper[4563]: I1124 09:19:39.980175 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0283-account-create-wklw4"] Nov 24 09:19:39 crc kubenswrapper[4563]: I1124 09:19:39.999589 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd2c44a4-86c6-43b2-9f19-ab409f2eaded-operator-scripts\") pod \"nova-api-0283-account-create-wklw4\" (UID: \"fd2c44a4-86c6-43b2-9f19-ab409f2eaded\") " pod="openstack/nova-api-0283-account-create-wklw4" Nov 24 09:19:39 crc kubenswrapper[4563]: I1124 09:19:39.999725 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48ch7\" (UniqueName: \"kubernetes.io/projected/71b426e4-d393-4260-b66f-28c288ce8e89-kube-api-access-48ch7\") pod \"nova-cell0-db-create-9nwp9\" (UID: \"71b426e4-d393-4260-b66f-28c288ce8e89\") " pod="openstack/nova-cell0-db-create-9nwp9" Nov 24 09:19:39 crc kubenswrapper[4563]: I1124 09:19:39.999830 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgpmp\" (UniqueName: \"kubernetes.io/projected/fd2c44a4-86c6-43b2-9f19-ab409f2eaded-kube-api-access-zgpmp\") pod \"nova-api-0283-account-create-wklw4\" (UID: \"fd2c44a4-86c6-43b2-9f19-ab409f2eaded\") " pod="openstack/nova-api-0283-account-create-wklw4" Nov 24 09:19:39 crc kubenswrapper[4563]: I1124 09:19:39.999958 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54187c3a-449a-420f-a04c-f4e6c6f7fc3f-operator-scripts\") pod \"nova-api-db-create-zzpdk\" (UID: 
\"54187c3a-449a-420f-a04c-f4e6c6f7fc3f\") " pod="openstack/nova-api-db-create-zzpdk" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:39.999995 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bns8\" (UniqueName: \"kubernetes.io/projected/54187c3a-449a-420f-a04c-f4e6c6f7fc3f-kube-api-access-9bns8\") pod \"nova-api-db-create-zzpdk\" (UID: \"54187c3a-449a-420f-a04c-f4e6c6f7fc3f\") " pod="openstack/nova-api-db-create-zzpdk" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.000049 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71b426e4-d393-4260-b66f-28c288ce8e89-operator-scripts\") pod \"nova-cell0-db-create-9nwp9\" (UID: \"71b426e4-d393-4260-b66f-28c288ce8e89\") " pod="openstack/nova-cell0-db-create-9nwp9" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.001184 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54187c3a-449a-420f-a04c-f4e6c6f7fc3f-operator-scripts\") pod \"nova-api-db-create-zzpdk\" (UID: \"54187c3a-449a-420f-a04c-f4e6c6f7fc3f\") " pod="openstack/nova-api-db-create-zzpdk" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.020280 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bns8\" (UniqueName: \"kubernetes.io/projected/54187c3a-449a-420f-a04c-f4e6c6f7fc3f-kube-api-access-9bns8\") pod \"nova-api-db-create-zzpdk\" (UID: \"54187c3a-449a-420f-a04c-f4e6c6f7fc3f\") " pod="openstack/nova-api-db-create-zzpdk" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.074190 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-zzpdk" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.103074 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgpmp\" (UniqueName: \"kubernetes.io/projected/fd2c44a4-86c6-43b2-9f19-ab409f2eaded-kube-api-access-zgpmp\") pod \"nova-api-0283-account-create-wklw4\" (UID: \"fd2c44a4-86c6-43b2-9f19-ab409f2eaded\") " pod="openstack/nova-api-0283-account-create-wklw4" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.103163 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71b426e4-d393-4260-b66f-28c288ce8e89-operator-scripts\") pod \"nova-cell0-db-create-9nwp9\" (UID: \"71b426e4-d393-4260-b66f-28c288ce8e89\") " pod="openstack/nova-cell0-db-create-9nwp9" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.103237 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd2c44a4-86c6-43b2-9f19-ab409f2eaded-operator-scripts\") pod \"nova-api-0283-account-create-wklw4\" (UID: \"fd2c44a4-86c6-43b2-9f19-ab409f2eaded\") " pod="openstack/nova-api-0283-account-create-wklw4" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.103279 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48ch7\" (UniqueName: \"kubernetes.io/projected/71b426e4-d393-4260-b66f-28c288ce8e89-kube-api-access-48ch7\") pod \"nova-cell0-db-create-9nwp9\" (UID: \"71b426e4-d393-4260-b66f-28c288ce8e89\") " pod="openstack/nova-cell0-db-create-9nwp9" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.104257 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71b426e4-d393-4260-b66f-28c288ce8e89-operator-scripts\") pod \"nova-cell0-db-create-9nwp9\" (UID: 
\"71b426e4-d393-4260-b66f-28c288ce8e89\") " pod="openstack/nova-cell0-db-create-9nwp9" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.104331 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd2c44a4-86c6-43b2-9f19-ab409f2eaded-operator-scripts\") pod \"nova-api-0283-account-create-wklw4\" (UID: \"fd2c44a4-86c6-43b2-9f19-ab409f2eaded\") " pod="openstack/nova-api-0283-account-create-wklw4" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.118942 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48ch7\" (UniqueName: \"kubernetes.io/projected/71b426e4-d393-4260-b66f-28c288ce8e89-kube-api-access-48ch7\") pod \"nova-cell0-db-create-9nwp9\" (UID: \"71b426e4-d393-4260-b66f-28c288ce8e89\") " pod="openstack/nova-cell0-db-create-9nwp9" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.119253 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgpmp\" (UniqueName: \"kubernetes.io/projected/fd2c44a4-86c6-43b2-9f19-ab409f2eaded-kube-api-access-zgpmp\") pod \"nova-api-0283-account-create-wklw4\" (UID: \"fd2c44a4-86c6-43b2-9f19-ab409f2eaded\") " pod="openstack/nova-api-0283-account-create-wklw4" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.166334 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-5dqsr"] Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.167776 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5dqsr" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.174915 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-9nwp9" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.179758 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-04d9-account-create-dm4rx"] Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.181029 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-04d9-account-create-dm4rx" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.184778 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.195768 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-5dqsr"] Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.204984 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cdd6b81-e677-43b6-a627-ac55f41bb1de-operator-scripts\") pod \"nova-cell1-db-create-5dqsr\" (UID: \"2cdd6b81-e677-43b6-a627-ac55f41bb1de\") " pod="openstack/nova-cell1-db-create-5dqsr" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.205055 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb7dz\" (UniqueName: \"kubernetes.io/projected/f0ee4dc6-69fd-4151-9e35-68fde68c500e-kube-api-access-cb7dz\") pod \"nova-cell0-04d9-account-create-dm4rx\" (UID: \"f0ee4dc6-69fd-4151-9e35-68fde68c500e\") " pod="openstack/nova-cell0-04d9-account-create-dm4rx" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.205076 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdjfm\" (UniqueName: \"kubernetes.io/projected/2cdd6b81-e677-43b6-a627-ac55f41bb1de-kube-api-access-wdjfm\") pod \"nova-cell1-db-create-5dqsr\" (UID: \"2cdd6b81-e677-43b6-a627-ac55f41bb1de\") " 
pod="openstack/nova-cell1-db-create-5dqsr" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.205118 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0ee4dc6-69fd-4151-9e35-68fde68c500e-operator-scripts\") pod \"nova-cell0-04d9-account-create-dm4rx\" (UID: \"f0ee4dc6-69fd-4151-9e35-68fde68c500e\") " pod="openstack/nova-cell0-04d9-account-create-dm4rx" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.215384 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-04d9-account-create-dm4rx"] Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.267962 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-9a24-account-create-gdlvv"] Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.269321 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9a24-account-create-gdlvv" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.271033 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.275764 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9a24-account-create-gdlvv"] Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.298196 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0283-account-create-wklw4" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.306965 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71b6f993-c4ff-4ea9-9306-5d30c09f0f8c-operator-scripts\") pod \"nova-cell1-9a24-account-create-gdlvv\" (UID: \"71b6f993-c4ff-4ea9-9306-5d30c09f0f8c\") " pod="openstack/nova-cell1-9a24-account-create-gdlvv" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.307101 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cdd6b81-e677-43b6-a627-ac55f41bb1de-operator-scripts\") pod \"nova-cell1-db-create-5dqsr\" (UID: \"2cdd6b81-e677-43b6-a627-ac55f41bb1de\") " pod="openstack/nova-cell1-db-create-5dqsr" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.307144 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2pdw\" (UniqueName: \"kubernetes.io/projected/71b6f993-c4ff-4ea9-9306-5d30c09f0f8c-kube-api-access-t2pdw\") pod \"nova-cell1-9a24-account-create-gdlvv\" (UID: \"71b6f993-c4ff-4ea9-9306-5d30c09f0f8c\") " pod="openstack/nova-cell1-9a24-account-create-gdlvv" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.307207 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb7dz\" (UniqueName: \"kubernetes.io/projected/f0ee4dc6-69fd-4151-9e35-68fde68c500e-kube-api-access-cb7dz\") pod \"nova-cell0-04d9-account-create-dm4rx\" (UID: \"f0ee4dc6-69fd-4151-9e35-68fde68c500e\") " pod="openstack/nova-cell0-04d9-account-create-dm4rx" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.307239 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdjfm\" (UniqueName: 
\"kubernetes.io/projected/2cdd6b81-e677-43b6-a627-ac55f41bb1de-kube-api-access-wdjfm\") pod \"nova-cell1-db-create-5dqsr\" (UID: \"2cdd6b81-e677-43b6-a627-ac55f41bb1de\") " pod="openstack/nova-cell1-db-create-5dqsr" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.307291 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0ee4dc6-69fd-4151-9e35-68fde68c500e-operator-scripts\") pod \"nova-cell0-04d9-account-create-dm4rx\" (UID: \"f0ee4dc6-69fd-4151-9e35-68fde68c500e\") " pod="openstack/nova-cell0-04d9-account-create-dm4rx" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.307970 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0ee4dc6-69fd-4151-9e35-68fde68c500e-operator-scripts\") pod \"nova-cell0-04d9-account-create-dm4rx\" (UID: \"f0ee4dc6-69fd-4151-9e35-68fde68c500e\") " pod="openstack/nova-cell0-04d9-account-create-dm4rx" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.309321 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cdd6b81-e677-43b6-a627-ac55f41bb1de-operator-scripts\") pod \"nova-cell1-db-create-5dqsr\" (UID: \"2cdd6b81-e677-43b6-a627-ac55f41bb1de\") " pod="openstack/nova-cell1-db-create-5dqsr" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.323337 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdjfm\" (UniqueName: \"kubernetes.io/projected/2cdd6b81-e677-43b6-a627-ac55f41bb1de-kube-api-access-wdjfm\") pod \"nova-cell1-db-create-5dqsr\" (UID: \"2cdd6b81-e677-43b6-a627-ac55f41bb1de\") " pod="openstack/nova-cell1-db-create-5dqsr" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.324301 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb7dz\" (UniqueName: 
\"kubernetes.io/projected/f0ee4dc6-69fd-4151-9e35-68fde68c500e-kube-api-access-cb7dz\") pod \"nova-cell0-04d9-account-create-dm4rx\" (UID: \"f0ee4dc6-69fd-4151-9e35-68fde68c500e\") " pod="openstack/nova-cell0-04d9-account-create-dm4rx" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.409610 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2pdw\" (UniqueName: \"kubernetes.io/projected/71b6f993-c4ff-4ea9-9306-5d30c09f0f8c-kube-api-access-t2pdw\") pod \"nova-cell1-9a24-account-create-gdlvv\" (UID: \"71b6f993-c4ff-4ea9-9306-5d30c09f0f8c\") " pod="openstack/nova-cell1-9a24-account-create-gdlvv" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.409898 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71b6f993-c4ff-4ea9-9306-5d30c09f0f8c-operator-scripts\") pod \"nova-cell1-9a24-account-create-gdlvv\" (UID: \"71b6f993-c4ff-4ea9-9306-5d30c09f0f8c\") " pod="openstack/nova-cell1-9a24-account-create-gdlvv" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.410555 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71b6f993-c4ff-4ea9-9306-5d30c09f0f8c-operator-scripts\") pod \"nova-cell1-9a24-account-create-gdlvv\" (UID: \"71b6f993-c4ff-4ea9-9306-5d30c09f0f8c\") " pod="openstack/nova-cell1-9a24-account-create-gdlvv" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.424293 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2pdw\" (UniqueName: \"kubernetes.io/projected/71b6f993-c4ff-4ea9-9306-5d30c09f0f8c-kube-api-access-t2pdw\") pod \"nova-cell1-9a24-account-create-gdlvv\" (UID: \"71b6f993-c4ff-4ea9-9306-5d30c09f0f8c\") " pod="openstack/nova-cell1-9a24-account-create-gdlvv" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.503121 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-5dqsr" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.514024 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-04d9-account-create-dm4rx" Nov 24 09:19:40 crc kubenswrapper[4563]: I1124 09:19:40.582305 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9a24-account-create-gdlvv" Nov 24 09:19:42 crc kubenswrapper[4563]: I1124 09:19:42.838940 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 24 09:19:42 crc kubenswrapper[4563]: I1124 09:19:42.876525 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " Nov 24 09:19:42 crc kubenswrapper[4563]: I1124 09:19:42.876749 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/126732ea-0f6a-4fd6-9b5b-959e4da904fe-logs\") pod \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " Nov 24 09:19:42 crc kubenswrapper[4563]: I1124 09:19:42.876929 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/126732ea-0f6a-4fd6-9b5b-959e4da904fe-internal-tls-certs\") pod \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " Nov 24 09:19:42 crc kubenswrapper[4563]: I1124 09:19:42.877058 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/126732ea-0f6a-4fd6-9b5b-959e4da904fe-httpd-run\") pod \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " 
Nov 24 09:19:42 crc kubenswrapper[4563]: I1124 09:19:42.877134 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126732ea-0f6a-4fd6-9b5b-959e4da904fe-combined-ca-bundle\") pod \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " Nov 24 09:19:42 crc kubenswrapper[4563]: I1124 09:19:42.877196 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/126732ea-0f6a-4fd6-9b5b-959e4da904fe-scripts\") pod \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " Nov 24 09:19:42 crc kubenswrapper[4563]: I1124 09:19:42.877284 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wblvn\" (UniqueName: \"kubernetes.io/projected/126732ea-0f6a-4fd6-9b5b-959e4da904fe-kube-api-access-wblvn\") pod \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " Nov 24 09:19:42 crc kubenswrapper[4563]: I1124 09:19:42.877360 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126732ea-0f6a-4fd6-9b5b-959e4da904fe-config-data\") pod \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\" (UID: \"126732ea-0f6a-4fd6-9b5b-959e4da904fe\") " Nov 24 09:19:42 crc kubenswrapper[4563]: I1124 09:19:42.877596 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/126732ea-0f6a-4fd6-9b5b-959e4da904fe-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "126732ea-0f6a-4fd6-9b5b-959e4da904fe" (UID: "126732ea-0f6a-4fd6-9b5b-959e4da904fe"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:19:42 crc kubenswrapper[4563]: I1124 09:19:42.877960 4563 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/126732ea-0f6a-4fd6-9b5b-959e4da904fe-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:42 crc kubenswrapper[4563]: I1124 09:19:42.880584 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/126732ea-0f6a-4fd6-9b5b-959e4da904fe-logs" (OuterVolumeSpecName: "logs") pod "126732ea-0f6a-4fd6-9b5b-959e4da904fe" (UID: "126732ea-0f6a-4fd6-9b5b-959e4da904fe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:19:42 crc kubenswrapper[4563]: I1124 09:19:42.882843 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "126732ea-0f6a-4fd6-9b5b-959e4da904fe" (UID: "126732ea-0f6a-4fd6-9b5b-959e4da904fe"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 24 09:19:42 crc kubenswrapper[4563]: I1124 09:19:42.885499 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/126732ea-0f6a-4fd6-9b5b-959e4da904fe-kube-api-access-wblvn" (OuterVolumeSpecName: "kube-api-access-wblvn") pod "126732ea-0f6a-4fd6-9b5b-959e4da904fe" (UID: "126732ea-0f6a-4fd6-9b5b-959e4da904fe"). InnerVolumeSpecName "kube-api-access-wblvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:19:42 crc kubenswrapper[4563]: I1124 09:19:42.882843 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/126732ea-0f6a-4fd6-9b5b-959e4da904fe-scripts" (OuterVolumeSpecName: "scripts") pod "126732ea-0f6a-4fd6-9b5b-959e4da904fe" (UID: "126732ea-0f6a-4fd6-9b5b-959e4da904fe"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:42 crc kubenswrapper[4563]: I1124 09:19:42.902179 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/126732ea-0f6a-4fd6-9b5b-959e4da904fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "126732ea-0f6a-4fd6-9b5b-959e4da904fe" (UID: "126732ea-0f6a-4fd6-9b5b-959e4da904fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:42 crc kubenswrapper[4563]: I1124 09:19:42.919810 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/126732ea-0f6a-4fd6-9b5b-959e4da904fe-config-data" (OuterVolumeSpecName: "config-data") pod "126732ea-0f6a-4fd6-9b5b-959e4da904fe" (UID: "126732ea-0f6a-4fd6-9b5b-959e4da904fe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:42 crc kubenswrapper[4563]: I1124 09:19:42.923877 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/126732ea-0f6a-4fd6-9b5b-959e4da904fe-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "126732ea-0f6a-4fd6-9b5b-959e4da904fe" (UID: "126732ea-0f6a-4fd6-9b5b-959e4da904fe"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:42 crc kubenswrapper[4563]: I1124 09:19:42.979668 4563 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/126732ea-0f6a-4fd6-9b5b-959e4da904fe-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:42 crc kubenswrapper[4563]: I1124 09:19:42.979763 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126732ea-0f6a-4fd6-9b5b-959e4da904fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:42 crc kubenswrapper[4563]: I1124 09:19:42.979818 4563 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/126732ea-0f6a-4fd6-9b5b-959e4da904fe-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:42 crc kubenswrapper[4563]: I1124 09:19:42.979867 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wblvn\" (UniqueName: \"kubernetes.io/projected/126732ea-0f6a-4fd6-9b5b-959e4da904fe-kube-api-access-wblvn\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:42 crc kubenswrapper[4563]: I1124 09:19:42.979927 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126732ea-0f6a-4fd6-9b5b-959e4da904fe-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:42 crc kubenswrapper[4563]: I1124 09:19:42.980009 4563 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Nov 24 09:19:42 crc kubenswrapper[4563]: I1124 09:19:42.980060 4563 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/126732ea-0f6a-4fd6-9b5b-959e4da904fe-logs\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.019289 4563 operation_generator.go:917] 
UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.100219 4563 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.128136 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aeeceac4-ca86-4f39-8871-15ef4052ff3d","Type":"ContainerStarted","Data":"5076f4e1d49456f48c699e6def7a248cc086b3542f7e39f95d992d87965205aa"} Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.132118 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"9b1ff524-d9dc-4433-a21c-f6d00e3b89d4","Type":"ContainerStarted","Data":"25b13f437e7b7e23cf49aff4d374571521dbf3f5afde68d412fe456e329a81cf"} Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.135174 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"126732ea-0f6a-4fd6-9b5b-959e4da904fe","Type":"ContainerDied","Data":"95f1baf97eab6396d024ca842b6f3774fd17f77904ecf3cc6df056faf116c8d3"} Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.135218 4563 scope.go:117] "RemoveContainer" containerID="123de7ce0128b04a34dc9c98f1bd7fba7f887c0a9e1265cd172ebac4b29c79f6" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.135250 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.149983 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.041297965 podStartE2EDuration="11.149966414s" podCreationTimestamp="2025-11-24 09:19:32 +0000 UTC" firstStartedPulling="2025-11-24 09:19:33.298155385 +0000 UTC m=+950.557132832" lastFinishedPulling="2025-11-24 09:19:42.406823834 +0000 UTC m=+959.665801281" observedRunningTime="2025-11-24 09:19:43.143936982 +0000 UTC m=+960.402914429" watchObservedRunningTime="2025-11-24 09:19:43.149966414 +0000 UTC m=+960.408943861" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.163699 4563 scope.go:117] "RemoveContainer" containerID="1e2e3f0bbd509f6376d01716104cf491690003d78ec1df0ea20d767237402289" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.190656 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.219323 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.238883 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 09:19:43 crc kubenswrapper[4563]: E1124 09:19:43.239310 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="126732ea-0f6a-4fd6-9b5b-959e4da904fe" containerName="glance-log" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.239327 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="126732ea-0f6a-4fd6-9b5b-959e4da904fe" containerName="glance-log" Nov 24 09:19:43 crc kubenswrapper[4563]: E1124 09:19:43.239342 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="126732ea-0f6a-4fd6-9b5b-959e4da904fe" containerName="glance-httpd" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 
09:19:43.239348 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="126732ea-0f6a-4fd6-9b5b-959e4da904fe" containerName="glance-httpd" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.239775 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="126732ea-0f6a-4fd6-9b5b-959e4da904fe" containerName="glance-httpd" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.239824 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="126732ea-0f6a-4fd6-9b5b-959e4da904fe" containerName="glance-log" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.241024 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.243783 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.246247 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.246317 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.312967 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0283-account-create-wklw4"] Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.323545 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-04d9-account-create-dm4rx"] Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.324055 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.329087 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-5dqsr"] Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.335667 4563 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-zzpdk"] Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.354350 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9nwp9"] Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.370548 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9a24-account-create-gdlvv"] Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.380287 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.406976 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57487f8-d1f8-4f97-b92e-7385ecc88074-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b57487f8-d1f8-4f97-b92e-7385ecc88074\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.407010 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b57487f8-d1f8-4f97-b92e-7385ecc88074-logs\") pod \"glance-default-internal-api-0\" (UID: \"b57487f8-d1f8-4f97-b92e-7385ecc88074\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.407052 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b57487f8-d1f8-4f97-b92e-7385ecc88074-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b57487f8-d1f8-4f97-b92e-7385ecc88074\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.407071 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b57487f8-d1f8-4f97-b92e-7385ecc88074-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b57487f8-d1f8-4f97-b92e-7385ecc88074\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.407119 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b57487f8-d1f8-4f97-b92e-7385ecc88074-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b57487f8-d1f8-4f97-b92e-7385ecc88074\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.407156 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57487f8-d1f8-4f97-b92e-7385ecc88074-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b57487f8-d1f8-4f97-b92e-7385ecc88074\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.407185 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"b57487f8-d1f8-4f97-b92e-7385ecc88074\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.407207 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tthvg\" (UniqueName: \"kubernetes.io/projected/b57487f8-d1f8-4f97-b92e-7385ecc88074-kube-api-access-tthvg\") pod \"glance-default-internal-api-0\" (UID: \"b57487f8-d1f8-4f97-b92e-7385ecc88074\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.426891 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" 
Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.470875 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5487cdb76f-rn9rx"] Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.509049 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b57487f8-d1f8-4f97-b92e-7385ecc88074-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b57487f8-d1f8-4f97-b92e-7385ecc88074\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.509089 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b57487f8-d1f8-4f97-b92e-7385ecc88074-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b57487f8-d1f8-4f97-b92e-7385ecc88074\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.509151 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b57487f8-d1f8-4f97-b92e-7385ecc88074-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b57487f8-d1f8-4f97-b92e-7385ecc88074\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.509193 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57487f8-d1f8-4f97-b92e-7385ecc88074-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b57487f8-d1f8-4f97-b92e-7385ecc88074\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.509229 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"b57487f8-d1f8-4f97-b92e-7385ecc88074\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.509252 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tthvg\" (UniqueName: \"kubernetes.io/projected/b57487f8-d1f8-4f97-b92e-7385ecc88074-kube-api-access-tthvg\") pod \"glance-default-internal-api-0\" (UID: \"b57487f8-d1f8-4f97-b92e-7385ecc88074\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.509294 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57487f8-d1f8-4f97-b92e-7385ecc88074-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b57487f8-d1f8-4f97-b92e-7385ecc88074\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.509307 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b57487f8-d1f8-4f97-b92e-7385ecc88074-logs\") pod \"glance-default-internal-api-0\" (UID: \"b57487f8-d1f8-4f97-b92e-7385ecc88074\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.509700 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b57487f8-d1f8-4f97-b92e-7385ecc88074-logs\") pod \"glance-default-internal-api-0\" (UID: \"b57487f8-d1f8-4f97-b92e-7385ecc88074\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.510168 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b57487f8-d1f8-4f97-b92e-7385ecc88074-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b57487f8-d1f8-4f97-b92e-7385ecc88074\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:19:43 
crc kubenswrapper[4563]: I1124 09:19:43.510616 4563 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"b57487f8-d1f8-4f97-b92e-7385ecc88074\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.515293 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b57487f8-d1f8-4f97-b92e-7385ecc88074-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b57487f8-d1f8-4f97-b92e-7385ecc88074\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.526829 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57487f8-d1f8-4f97-b92e-7385ecc88074-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b57487f8-d1f8-4f97-b92e-7385ecc88074\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.531128 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b57487f8-d1f8-4f97-b92e-7385ecc88074-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b57487f8-d1f8-4f97-b92e-7385ecc88074\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.533144 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tthvg\" (UniqueName: \"kubernetes.io/projected/b57487f8-d1f8-4f97-b92e-7385ecc88074-kube-api-access-tthvg\") pod \"glance-default-internal-api-0\" (UID: \"b57487f8-d1f8-4f97-b92e-7385ecc88074\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.535160 4563 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57487f8-d1f8-4f97-b92e-7385ecc88074-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b57487f8-d1f8-4f97-b92e-7385ecc88074\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.554367 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"b57487f8-d1f8-4f97-b92e-7385ecc88074\") " pod="openstack/glance-default-internal-api-0" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.574785 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.808078 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-748c4bdffd-w974j" Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.935342 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.935610 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ce3468f6-f565-41d5-ad15-302e10230479" containerName="glance-log" containerID="cri-o://e4972f237565195a76103efe1025e783d7e0d1366c6f017de9154a12791baa96" gracePeriod=30 Nov 24 09:19:43 crc kubenswrapper[4563]: I1124 09:19:43.936098 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ce3468f6-f565-41d5-ad15-302e10230479" containerName="glance-httpd" containerID="cri-o://3304aad3aaefb9843dcbd7d53d407c72fbd21eb60a457f9eb5ea04521218050e" gracePeriod=30 Nov 24 09:19:44 crc kubenswrapper[4563]: I1124 09:19:44.013924 4563 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-748c4bdffd-w974j" Nov 24 09:19:44 crc kubenswrapper[4563]: I1124 09:19:44.159427 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5487cdb76f-rn9rx" event={"ID":"64984138-1ff3-4d53-b4b9-e301fc5f2f80","Type":"ContainerStarted","Data":"6cc5fdcdcd593ae8e91ce173e5eae71cba677bd7195c891ed0b14c31f4d07045"} Nov 24 09:19:44 crc kubenswrapper[4563]: I1124 09:19:44.160094 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5487cdb76f-rn9rx" event={"ID":"64984138-1ff3-4d53-b4b9-e301fc5f2f80","Type":"ContainerStarted","Data":"0cab2c3dcd446954ac3e7699e16085597f90ac1eec56fa815b83e104a392c53d"} Nov 24 09:19:44 crc kubenswrapper[4563]: I1124 09:19:44.163445 4563 generic.go:334] "Generic (PLEG): container finished" podID="fd2c44a4-86c6-43b2-9f19-ab409f2eaded" containerID="6f2af6eac37517ee8e9aee60a7627d32c3557a461b9321cebbc2c5326f4e2fa2" exitCode=0 Nov 24 09:19:44 crc kubenswrapper[4563]: I1124 09:19:44.163839 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0283-account-create-wklw4" event={"ID":"fd2c44a4-86c6-43b2-9f19-ab409f2eaded","Type":"ContainerDied","Data":"6f2af6eac37517ee8e9aee60a7627d32c3557a461b9321cebbc2c5326f4e2fa2"} Nov 24 09:19:44 crc kubenswrapper[4563]: I1124 09:19:44.163863 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0283-account-create-wklw4" event={"ID":"fd2c44a4-86c6-43b2-9f19-ab409f2eaded","Type":"ContainerStarted","Data":"7ae679599bc794c7c7adbf8bfa5a2c583b5c98caf19f76a5f2f603222487061b"} Nov 24 09:19:44 crc kubenswrapper[4563]: I1124 09:19:44.166374 4563 generic.go:334] "Generic (PLEG): container finished" podID="71b6f993-c4ff-4ea9-9306-5d30c09f0f8c" containerID="c95204df4565a0ad14fee7b6fccca9aea380e3d54a7217bffee4ab5f68755207" exitCode=0 Nov 24 09:19:44 crc kubenswrapper[4563]: I1124 09:19:44.166471 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-9a24-account-create-gdlvv" event={"ID":"71b6f993-c4ff-4ea9-9306-5d30c09f0f8c","Type":"ContainerDied","Data":"c95204df4565a0ad14fee7b6fccca9aea380e3d54a7217bffee4ab5f68755207"} Nov 24 09:19:44 crc kubenswrapper[4563]: I1124 09:19:44.166527 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9a24-account-create-gdlvv" event={"ID":"71b6f993-c4ff-4ea9-9306-5d30c09f0f8c","Type":"ContainerStarted","Data":"8b6990e933da2cce61a802d5675f2789a7feac86269489dff9730ab9da5e6da3"} Nov 24 09:19:44 crc kubenswrapper[4563]: I1124 09:19:44.168029 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aeeceac4-ca86-4f39-8871-15ef4052ff3d","Type":"ContainerStarted","Data":"44c2ec53ff80cdecbfeb685bde0517afa0463cd95ef78efb84ea97ca83963a29"} Nov 24 09:19:44 crc kubenswrapper[4563]: I1124 09:19:44.169335 4563 generic.go:334] "Generic (PLEG): container finished" podID="71b426e4-d393-4260-b66f-28c288ce8e89" containerID="cb121a91c0552ff81f840885d3fb56bbadff92e9ebc7c462dda9fab6858b5336" exitCode=0 Nov 24 09:19:44 crc kubenswrapper[4563]: I1124 09:19:44.169372 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9nwp9" event={"ID":"71b426e4-d393-4260-b66f-28c288ce8e89","Type":"ContainerDied","Data":"cb121a91c0552ff81f840885d3fb56bbadff92e9ebc7c462dda9fab6858b5336"} Nov 24 09:19:44 crc kubenswrapper[4563]: I1124 09:19:44.169398 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9nwp9" event={"ID":"71b426e4-d393-4260-b66f-28c288ce8e89","Type":"ContainerStarted","Data":"ddc5944ed6a99ebd2242563f21f9e4d2b349560cd808f37819fd0170fae75429"} Nov 24 09:19:44 crc kubenswrapper[4563]: I1124 09:19:44.176725 4563 generic.go:334] "Generic (PLEG): container finished" podID="2cdd6b81-e677-43b6-a627-ac55f41bb1de" containerID="fdf38bf1b96efef1bc076e890ef650066b9626aa1431bc638ade4e93c4df030b" exitCode=0 Nov 24 09:19:44 crc kubenswrapper[4563]: 
I1124 09:19:44.176774 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5dqsr" event={"ID":"2cdd6b81-e677-43b6-a627-ac55f41bb1de","Type":"ContainerDied","Data":"fdf38bf1b96efef1bc076e890ef650066b9626aa1431bc638ade4e93c4df030b"} Nov 24 09:19:44 crc kubenswrapper[4563]: I1124 09:19:44.176791 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5dqsr" event={"ID":"2cdd6b81-e677-43b6-a627-ac55f41bb1de","Type":"ContainerStarted","Data":"02e11f7ed9f8405e3cf10c111af794a64a16d9a10e8e0650eeb56c30adb36d31"} Nov 24 09:19:44 crc kubenswrapper[4563]: I1124 09:19:44.181450 4563 generic.go:334] "Generic (PLEG): container finished" podID="f0ee4dc6-69fd-4151-9e35-68fde68c500e" containerID="7bd613c1e631f06f8c99dc293308cdd878b574255f51170c4e79e7db1a5cba3f" exitCode=0 Nov 24 09:19:44 crc kubenswrapper[4563]: I1124 09:19:44.181538 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-04d9-account-create-dm4rx" event={"ID":"f0ee4dc6-69fd-4151-9e35-68fde68c500e","Type":"ContainerDied","Data":"7bd613c1e631f06f8c99dc293308cdd878b574255f51170c4e79e7db1a5cba3f"} Nov 24 09:19:44 crc kubenswrapper[4563]: I1124 09:19:44.181573 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-04d9-account-create-dm4rx" event={"ID":"f0ee4dc6-69fd-4151-9e35-68fde68c500e","Type":"ContainerStarted","Data":"d955a94b14b1bf96e9b2ef4fe719ec77e0a9573100378a905d972261f4f2df70"} Nov 24 09:19:44 crc kubenswrapper[4563]: I1124 09:19:44.183323 4563 generic.go:334] "Generic (PLEG): container finished" podID="ce3468f6-f565-41d5-ad15-302e10230479" containerID="e4972f237565195a76103efe1025e783d7e0d1366c6f017de9154a12791baa96" exitCode=143 Nov 24 09:19:44 crc kubenswrapper[4563]: I1124 09:19:44.183374 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"ce3468f6-f565-41d5-ad15-302e10230479","Type":"ContainerDied","Data":"e4972f237565195a76103efe1025e783d7e0d1366c6f017de9154a12791baa96"} Nov 24 09:19:44 crc kubenswrapper[4563]: I1124 09:19:44.197814 4563 generic.go:334] "Generic (PLEG): container finished" podID="54187c3a-449a-420f-a04c-f4e6c6f7fc3f" containerID="3c26adbc8059ff965ca66c7fa73df912c4211f8febaa28943e3cffd7423ac3c5" exitCode=0 Nov 24 09:19:44 crc kubenswrapper[4563]: I1124 09:19:44.197939 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zzpdk" event={"ID":"54187c3a-449a-420f-a04c-f4e6c6f7fc3f","Type":"ContainerDied","Data":"3c26adbc8059ff965ca66c7fa73df912c4211f8febaa28943e3cffd7423ac3c5"} Nov 24 09:19:44 crc kubenswrapper[4563]: I1124 09:19:44.197968 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zzpdk" event={"ID":"54187c3a-449a-420f-a04c-f4e6c6f7fc3f","Type":"ContainerStarted","Data":"802ecabd501e2bfc5a5e3242a2eb4a0dd4dddb84047c0bb5cef90e60a0ac845b"} Nov 24 09:19:44 crc kubenswrapper[4563]: I1124 09:19:44.208023 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 24 09:19:44 crc kubenswrapper[4563]: I1124 09:19:44.649525 4563 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7cd5c59c66-hrmf5" podUID="0ec5b651-57ef-414b-8c8e-4b488d71663f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Nov 24 09:19:44 crc kubenswrapper[4563]: I1124 09:19:44.649674 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.074009 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="126732ea-0f6a-4fd6-9b5b-959e4da904fe" 
path="/var/lib/kubelet/pods/126732ea-0f6a-4fd6-9b5b-959e4da904fe/volumes" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.207962 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b57487f8-d1f8-4f97-b92e-7385ecc88074","Type":"ContainerStarted","Data":"e80706c241cd56865b3db1463978d3f23435fbff8e48f8100df1b41bec77a7ce"} Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.208014 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b57487f8-d1f8-4f97-b92e-7385ecc88074","Type":"ContainerStarted","Data":"e8555f765ceb56f1ff5ce20b1a9cd092f80dac8b3b81fea6fa7d389a05dd3131"} Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.210822 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aeeceac4-ca86-4f39-8871-15ef4052ff3d","Type":"ContainerStarted","Data":"bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47"} Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.210870 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aeeceac4-ca86-4f39-8871-15ef4052ff3d","Type":"ContainerStarted","Data":"1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca"} Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.212795 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5487cdb76f-rn9rx" event={"ID":"64984138-1ff3-4d53-b4b9-e301fc5f2f80","Type":"ContainerStarted","Data":"75b38c44f98743b3401e7832f077cded57ac70da0173d10cfc87a27101c8412a"} Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.213134 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.703840 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-5dqsr" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.731223 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5487cdb76f-rn9rx" podStartSLOduration=8.731209509 podStartE2EDuration="8.731209509s" podCreationTimestamp="2025-11-24 09:19:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:19:45.234957125 +0000 UTC m=+962.493934572" watchObservedRunningTime="2025-11-24 09:19:45.731209509 +0000 UTC m=+962.990186956" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.770462 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdjfm\" (UniqueName: \"kubernetes.io/projected/2cdd6b81-e677-43b6-a627-ac55f41bb1de-kube-api-access-wdjfm\") pod \"2cdd6b81-e677-43b6-a627-ac55f41bb1de\" (UID: \"2cdd6b81-e677-43b6-a627-ac55f41bb1de\") " Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.770619 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cdd6b81-e677-43b6-a627-ac55f41bb1de-operator-scripts\") pod \"2cdd6b81-e677-43b6-a627-ac55f41bb1de\" (UID: \"2cdd6b81-e677-43b6-a627-ac55f41bb1de\") " Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.772314 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cdd6b81-e677-43b6-a627-ac55f41bb1de-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2cdd6b81-e677-43b6-a627-ac55f41bb1de" (UID: "2cdd6b81-e677-43b6-a627-ac55f41bb1de"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.781056 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cdd6b81-e677-43b6-a627-ac55f41bb1de-kube-api-access-wdjfm" (OuterVolumeSpecName: "kube-api-access-wdjfm") pod "2cdd6b81-e677-43b6-a627-ac55f41bb1de" (UID: "2cdd6b81-e677-43b6-a627-ac55f41bb1de"). InnerVolumeSpecName "kube-api-access-wdjfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.825339 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-04d9-account-create-dm4rx" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.832157 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0283-account-create-wklw4" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.838415 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9a24-account-create-gdlvv" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.846175 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9nwp9" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.848953 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-zzpdk" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.874718 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48ch7\" (UniqueName: \"kubernetes.io/projected/71b426e4-d393-4260-b66f-28c288ce8e89-kube-api-access-48ch7\") pod \"71b426e4-d393-4260-b66f-28c288ce8e89\" (UID: \"71b426e4-d393-4260-b66f-28c288ce8e89\") " Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.874764 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71b426e4-d393-4260-b66f-28c288ce8e89-operator-scripts\") pod \"71b426e4-d393-4260-b66f-28c288ce8e89\" (UID: \"71b426e4-d393-4260-b66f-28c288ce8e89\") " Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.876386 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71b426e4-d393-4260-b66f-28c288ce8e89-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "71b426e4-d393-4260-b66f-28c288ce8e89" (UID: "71b426e4-d393-4260-b66f-28c288ce8e89"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.876489 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71b6f993-c4ff-4ea9-9306-5d30c09f0f8c-operator-scripts\") pod \"71b6f993-c4ff-4ea9-9306-5d30c09f0f8c\" (UID: \"71b6f993-c4ff-4ea9-9306-5d30c09f0f8c\") " Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.876566 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgpmp\" (UniqueName: \"kubernetes.io/projected/fd2c44a4-86c6-43b2-9f19-ab409f2eaded-kube-api-access-zgpmp\") pod \"fd2c44a4-86c6-43b2-9f19-ab409f2eaded\" (UID: \"fd2c44a4-86c6-43b2-9f19-ab409f2eaded\") " Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.876625 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2pdw\" (UniqueName: \"kubernetes.io/projected/71b6f993-c4ff-4ea9-9306-5d30c09f0f8c-kube-api-access-t2pdw\") pod \"71b6f993-c4ff-4ea9-9306-5d30c09f0f8c\" (UID: \"71b6f993-c4ff-4ea9-9306-5d30c09f0f8c\") " Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.876667 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0ee4dc6-69fd-4151-9e35-68fde68c500e-operator-scripts\") pod \"f0ee4dc6-69fd-4151-9e35-68fde68c500e\" (UID: \"f0ee4dc6-69fd-4151-9e35-68fde68c500e\") " Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.876744 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd2c44a4-86c6-43b2-9f19-ab409f2eaded-operator-scripts\") pod \"fd2c44a4-86c6-43b2-9f19-ab409f2eaded\" (UID: \"fd2c44a4-86c6-43b2-9f19-ab409f2eaded\") " Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.876820 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-cb7dz\" (UniqueName: \"kubernetes.io/projected/f0ee4dc6-69fd-4151-9e35-68fde68c500e-kube-api-access-cb7dz\") pod \"f0ee4dc6-69fd-4151-9e35-68fde68c500e\" (UID: \"f0ee4dc6-69fd-4151-9e35-68fde68c500e\") " Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.876851 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bns8\" (UniqueName: \"kubernetes.io/projected/54187c3a-449a-420f-a04c-f4e6c6f7fc3f-kube-api-access-9bns8\") pod \"54187c3a-449a-420f-a04c-f4e6c6f7fc3f\" (UID: \"54187c3a-449a-420f-a04c-f4e6c6f7fc3f\") " Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.876936 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54187c3a-449a-420f-a04c-f4e6c6f7fc3f-operator-scripts\") pod \"54187c3a-449a-420f-a04c-f4e6c6f7fc3f\" (UID: \"54187c3a-449a-420f-a04c-f4e6c6f7fc3f\") " Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.878038 4563 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71b426e4-d393-4260-b66f-28c288ce8e89-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.878057 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdjfm\" (UniqueName: \"kubernetes.io/projected/2cdd6b81-e677-43b6-a627-ac55f41bb1de-kube-api-access-wdjfm\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.878068 4563 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cdd6b81-e677-43b6-a627-ac55f41bb1de-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.878489 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/54187c3a-449a-420f-a04c-f4e6c6f7fc3f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "54187c3a-449a-420f-a04c-f4e6c6f7fc3f" (UID: "54187c3a-449a-420f-a04c-f4e6c6f7fc3f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.879064 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd2c44a4-86c6-43b2-9f19-ab409f2eaded-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fd2c44a4-86c6-43b2-9f19-ab409f2eaded" (UID: "fd2c44a4-86c6-43b2-9f19-ab409f2eaded"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.879187 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0ee4dc6-69fd-4151-9e35-68fde68c500e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f0ee4dc6-69fd-4151-9e35-68fde68c500e" (UID: "f0ee4dc6-69fd-4151-9e35-68fde68c500e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.880810 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71b6f993-c4ff-4ea9-9306-5d30c09f0f8c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "71b6f993-c4ff-4ea9-9306-5d30c09f0f8c" (UID: "71b6f993-c4ff-4ea9-9306-5d30c09f0f8c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.881763 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71b426e4-d393-4260-b66f-28c288ce8e89-kube-api-access-48ch7" (OuterVolumeSpecName: "kube-api-access-48ch7") pod "71b426e4-d393-4260-b66f-28c288ce8e89" (UID: "71b426e4-d393-4260-b66f-28c288ce8e89"). InnerVolumeSpecName "kube-api-access-48ch7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.883238 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0ee4dc6-69fd-4151-9e35-68fde68c500e-kube-api-access-cb7dz" (OuterVolumeSpecName: "kube-api-access-cb7dz") pod "f0ee4dc6-69fd-4151-9e35-68fde68c500e" (UID: "f0ee4dc6-69fd-4151-9e35-68fde68c500e"). InnerVolumeSpecName "kube-api-access-cb7dz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.886495 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd2c44a4-86c6-43b2-9f19-ab409f2eaded-kube-api-access-zgpmp" (OuterVolumeSpecName: "kube-api-access-zgpmp") pod "fd2c44a4-86c6-43b2-9f19-ab409f2eaded" (UID: "fd2c44a4-86c6-43b2-9f19-ab409f2eaded"). InnerVolumeSpecName "kube-api-access-zgpmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.886805 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54187c3a-449a-420f-a04c-f4e6c6f7fc3f-kube-api-access-9bns8" (OuterVolumeSpecName: "kube-api-access-9bns8") pod "54187c3a-449a-420f-a04c-f4e6c6f7fc3f" (UID: "54187c3a-449a-420f-a04c-f4e6c6f7fc3f"). InnerVolumeSpecName "kube-api-access-9bns8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.888035 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71b6f993-c4ff-4ea9-9306-5d30c09f0f8c-kube-api-access-t2pdw" (OuterVolumeSpecName: "kube-api-access-t2pdw") pod "71b6f993-c4ff-4ea9-9306-5d30c09f0f8c" (UID: "71b6f993-c4ff-4ea9-9306-5d30c09f0f8c"). InnerVolumeSpecName "kube-api-access-t2pdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.982012 4563 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71b6f993-c4ff-4ea9-9306-5d30c09f0f8c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.982042 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgpmp\" (UniqueName: \"kubernetes.io/projected/fd2c44a4-86c6-43b2-9f19-ab409f2eaded-kube-api-access-zgpmp\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.982059 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2pdw\" (UniqueName: \"kubernetes.io/projected/71b6f993-c4ff-4ea9-9306-5d30c09f0f8c-kube-api-access-t2pdw\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.982071 4563 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0ee4dc6-69fd-4151-9e35-68fde68c500e-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.982083 4563 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd2c44a4-86c6-43b2-9f19-ab409f2eaded-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.982095 4563 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-cb7dz\" (UniqueName: \"kubernetes.io/projected/f0ee4dc6-69fd-4151-9e35-68fde68c500e-kube-api-access-cb7dz\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.982106 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bns8\" (UniqueName: \"kubernetes.io/projected/54187c3a-449a-420f-a04c-f4e6c6f7fc3f-kube-api-access-9bns8\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.982118 4563 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54187c3a-449a-420f-a04c-f4e6c6f7fc3f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:45 crc kubenswrapper[4563]: I1124 09:19:45.982129 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48ch7\" (UniqueName: \"kubernetes.io/projected/71b426e4-d393-4260-b66f-28c288ce8e89-kube-api-access-48ch7\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:46 crc kubenswrapper[4563]: I1124 09:19:46.236136 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b57487f8-d1f8-4f97-b92e-7385ecc88074","Type":"ContainerStarted","Data":"162bc6096bfda05caaf144a844765ecc45eb91ca84d0dd291728d6874cee3a0a"} Nov 24 09:19:46 crc kubenswrapper[4563]: I1124 09:19:46.240097 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-9nwp9" Nov 24 09:19:46 crc kubenswrapper[4563]: I1124 09:19:46.240089 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9nwp9" event={"ID":"71b426e4-d393-4260-b66f-28c288ce8e89","Type":"ContainerDied","Data":"ddc5944ed6a99ebd2242563f21f9e4d2b349560cd808f37819fd0170fae75429"} Nov 24 09:19:46 crc kubenswrapper[4563]: I1124 09:19:46.240218 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddc5944ed6a99ebd2242563f21f9e4d2b349560cd808f37819fd0170fae75429" Nov 24 09:19:46 crc kubenswrapper[4563]: I1124 09:19:46.249662 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9a24-account-create-gdlvv" event={"ID":"71b6f993-c4ff-4ea9-9306-5d30c09f0f8c","Type":"ContainerDied","Data":"8b6990e933da2cce61a802d5675f2789a7feac86269489dff9730ab9da5e6da3"} Nov 24 09:19:46 crc kubenswrapper[4563]: I1124 09:19:46.249704 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b6990e933da2cce61a802d5675f2789a7feac86269489dff9730ab9da5e6da3" Nov 24 09:19:46 crc kubenswrapper[4563]: I1124 09:19:46.249697 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9a24-account-create-gdlvv" Nov 24 09:19:46 crc kubenswrapper[4563]: I1124 09:19:46.251421 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5dqsr" event={"ID":"2cdd6b81-e677-43b6-a627-ac55f41bb1de","Type":"ContainerDied","Data":"02e11f7ed9f8405e3cf10c111af794a64a16d9a10e8e0650eeb56c30adb36d31"} Nov 24 09:19:46 crc kubenswrapper[4563]: I1124 09:19:46.251468 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02e11f7ed9f8405e3cf10c111af794a64a16d9a10e8e0650eeb56c30adb36d31" Nov 24 09:19:46 crc kubenswrapper[4563]: I1124 09:19:46.251529 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-5dqsr" Nov 24 09:19:46 crc kubenswrapper[4563]: I1124 09:19:46.263365 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-04d9-account-create-dm4rx" event={"ID":"f0ee4dc6-69fd-4151-9e35-68fde68c500e","Type":"ContainerDied","Data":"d955a94b14b1bf96e9b2ef4fe719ec77e0a9573100378a905d972261f4f2df70"} Nov 24 09:19:46 crc kubenswrapper[4563]: I1124 09:19:46.263432 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d955a94b14b1bf96e9b2ef4fe719ec77e0a9573100378a905d972261f4f2df70" Nov 24 09:19:46 crc kubenswrapper[4563]: I1124 09:19:46.263457 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-04d9-account-create-dm4rx" Nov 24 09:19:46 crc kubenswrapper[4563]: I1124 09:19:46.266919 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.266904893 podStartE2EDuration="3.266904893s" podCreationTimestamp="2025-11-24 09:19:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:19:46.261125283 +0000 UTC m=+963.520102749" watchObservedRunningTime="2025-11-24 09:19:46.266904893 +0000 UTC m=+963.525882340" Nov 24 09:19:46 crc kubenswrapper[4563]: I1124 09:19:46.266968 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0283-account-create-wklw4" Nov 24 09:19:46 crc kubenswrapper[4563]: I1124 09:19:46.267011 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0283-account-create-wklw4" event={"ID":"fd2c44a4-86c6-43b2-9f19-ab409f2eaded","Type":"ContainerDied","Data":"7ae679599bc794c7c7adbf8bfa5a2c583b5c98caf19f76a5f2f603222487061b"} Nov 24 09:19:46 crc kubenswrapper[4563]: I1124 09:19:46.267058 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ae679599bc794c7c7adbf8bfa5a2c583b5c98caf19f76a5f2f603222487061b" Nov 24 09:19:46 crc kubenswrapper[4563]: I1124 09:19:46.275711 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zzpdk" event={"ID":"54187c3a-449a-420f-a04c-f4e6c6f7fc3f","Type":"ContainerDied","Data":"802ecabd501e2bfc5a5e3242a2eb4a0dd4dddb84047c0bb5cef90e60a0ac845b"} Nov 24 09:19:46 crc kubenswrapper[4563]: I1124 09:19:46.275831 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="802ecabd501e2bfc5a5e3242a2eb4a0dd4dddb84047c0bb5cef90e60a0ac845b" Nov 24 09:19:46 crc kubenswrapper[4563]: I1124 09:19:46.275763 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-zzpdk" Nov 24 09:19:46 crc kubenswrapper[4563]: I1124 09:19:46.276783 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.289078 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aeeceac4-ca86-4f39-8871-15ef4052ff3d","Type":"ContainerStarted","Data":"813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d"} Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.289557 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.289218 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aeeceac4-ca86-4f39-8871-15ef4052ff3d" containerName="sg-core" containerID="cri-o://bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47" gracePeriod=30 Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.289159 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aeeceac4-ca86-4f39-8871-15ef4052ff3d" containerName="ceilometer-central-agent" containerID="cri-o://44c2ec53ff80cdecbfeb685bde0517afa0463cd95ef78efb84ea97ca83963a29" gracePeriod=30 Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.289252 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aeeceac4-ca86-4f39-8871-15ef4052ff3d" containerName="proxy-httpd" containerID="cri-o://813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d" gracePeriod=30 Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.289285 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aeeceac4-ca86-4f39-8871-15ef4052ff3d" containerName="ceilometer-notification-agent" 
containerID="cri-o://1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca" gracePeriod=30 Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.292981 4563 generic.go:334] "Generic (PLEG): container finished" podID="ce3468f6-f565-41d5-ad15-302e10230479" containerID="3304aad3aaefb9843dcbd7d53d407c72fbd21eb60a457f9eb5ea04521218050e" exitCode=0 Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.293051 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ce3468f6-f565-41d5-ad15-302e10230479","Type":"ContainerDied","Data":"3304aad3aaefb9843dcbd7d53d407c72fbd21eb60a457f9eb5ea04521218050e"} Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.332062 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=6.613990033 podStartE2EDuration="10.332046614s" podCreationTimestamp="2025-11-24 09:19:37 +0000 UTC" firstStartedPulling="2025-11-24 09:19:42.39812151 +0000 UTC m=+959.657098958" lastFinishedPulling="2025-11-24 09:19:46.116178092 +0000 UTC m=+963.375155539" observedRunningTime="2025-11-24 09:19:47.316541061 +0000 UTC m=+964.575518508" watchObservedRunningTime="2025-11-24 09:19:47.332046614 +0000 UTC m=+964.591024061" Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.536562 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.718613 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tjtv\" (UniqueName: \"kubernetes.io/projected/ce3468f6-f565-41d5-ad15-302e10230479-kube-api-access-4tjtv\") pod \"ce3468f6-f565-41d5-ad15-302e10230479\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.718675 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce3468f6-f565-41d5-ad15-302e10230479-public-tls-certs\") pod \"ce3468f6-f565-41d5-ad15-302e10230479\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.719771 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce3468f6-f565-41d5-ad15-302e10230479-config-data\") pod \"ce3468f6-f565-41d5-ad15-302e10230479\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.719802 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ce3468f6-f565-41d5-ad15-302e10230479-httpd-run\") pod \"ce3468f6-f565-41d5-ad15-302e10230479\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.719848 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce3468f6-f565-41d5-ad15-302e10230479-combined-ca-bundle\") pod \"ce3468f6-f565-41d5-ad15-302e10230479\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.719890 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/ce3468f6-f565-41d5-ad15-302e10230479-logs\") pod \"ce3468f6-f565-41d5-ad15-302e10230479\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.719912 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ce3468f6-f565-41d5-ad15-302e10230479\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.719940 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce3468f6-f565-41d5-ad15-302e10230479-scripts\") pod \"ce3468f6-f565-41d5-ad15-302e10230479\" (UID: \"ce3468f6-f565-41d5-ad15-302e10230479\") " Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.720426 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce3468f6-f565-41d5-ad15-302e10230479-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ce3468f6-f565-41d5-ad15-302e10230479" (UID: "ce3468f6-f565-41d5-ad15-302e10230479"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.720678 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce3468f6-f565-41d5-ad15-302e10230479-logs" (OuterVolumeSpecName: "logs") pod "ce3468f6-f565-41d5-ad15-302e10230479" (UID: "ce3468f6-f565-41d5-ad15-302e10230479"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.720991 4563 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce3468f6-f565-41d5-ad15-302e10230479-logs\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.721015 4563 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ce3468f6-f565-41d5-ad15-302e10230479-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.725437 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce3468f6-f565-41d5-ad15-302e10230479-scripts" (OuterVolumeSpecName: "scripts") pod "ce3468f6-f565-41d5-ad15-302e10230479" (UID: "ce3468f6-f565-41d5-ad15-302e10230479"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.725876 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce3468f6-f565-41d5-ad15-302e10230479-kube-api-access-4tjtv" (OuterVolumeSpecName: "kube-api-access-4tjtv") pod "ce3468f6-f565-41d5-ad15-302e10230479" (UID: "ce3468f6-f565-41d5-ad15-302e10230479"). InnerVolumeSpecName "kube-api-access-4tjtv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.726777 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "ce3468f6-f565-41d5-ad15-302e10230479" (UID: "ce3468f6-f565-41d5-ad15-302e10230479"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.747301 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce3468f6-f565-41d5-ad15-302e10230479-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce3468f6-f565-41d5-ad15-302e10230479" (UID: "ce3468f6-f565-41d5-ad15-302e10230479"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.772053 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce3468f6-f565-41d5-ad15-302e10230479-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ce3468f6-f565-41d5-ad15-302e10230479" (UID: "ce3468f6-f565-41d5-ad15-302e10230479"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.779192 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce3468f6-f565-41d5-ad15-302e10230479-config-data" (OuterVolumeSpecName: "config-data") pod "ce3468f6-f565-41d5-ad15-302e10230479" (UID: "ce3468f6-f565-41d5-ad15-302e10230479"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.822956 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce3468f6-f565-41d5-ad15-302e10230479-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.823250 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce3468f6-f565-41d5-ad15-302e10230479-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.823283 4563 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.823292 4563 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce3468f6-f565-41d5-ad15-302e10230479-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.823301 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tjtv\" (UniqueName: \"kubernetes.io/projected/ce3468f6-f565-41d5-ad15-302e10230479-kube-api-access-4tjtv\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.823312 4563 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce3468f6-f565-41d5-ad15-302e10230479-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.838957 4563 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.856484 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 09:19:47 crc kubenswrapper[4563]: I1124 09:19:47.924979 4563 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.026559 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aeeceac4-ca86-4f39-8871-15ef4052ff3d-sg-core-conf-yaml\") pod \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\" (UID: \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\") " Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.026674 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aeeceac4-ca86-4f39-8871-15ef4052ff3d-scripts\") pod \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\" (UID: \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\") " Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.026732 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lc5mt\" (UniqueName: \"kubernetes.io/projected/aeeceac4-ca86-4f39-8871-15ef4052ff3d-kube-api-access-lc5mt\") pod \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\" (UID: \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\") " Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.026757 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeeceac4-ca86-4f39-8871-15ef4052ff3d-log-httpd\") pod \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\" (UID: \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\") " Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.026777 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeeceac4-ca86-4f39-8871-15ef4052ff3d-run-httpd\") pod \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\" 
(UID: \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\") " Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.026869 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeeceac4-ca86-4f39-8871-15ef4052ff3d-config-data\") pod \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\" (UID: \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\") " Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.026930 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeeceac4-ca86-4f39-8871-15ef4052ff3d-combined-ca-bundle\") pod \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\" (UID: \"aeeceac4-ca86-4f39-8871-15ef4052ff3d\") " Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.027928 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeeceac4-ca86-4f39-8871-15ef4052ff3d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "aeeceac4-ca86-4f39-8871-15ef4052ff3d" (UID: "aeeceac4-ca86-4f39-8871-15ef4052ff3d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.028136 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeeceac4-ca86-4f39-8871-15ef4052ff3d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "aeeceac4-ca86-4f39-8871-15ef4052ff3d" (UID: "aeeceac4-ca86-4f39-8871-15ef4052ff3d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.033763 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeeceac4-ca86-4f39-8871-15ef4052ff3d-scripts" (OuterVolumeSpecName: "scripts") pod "aeeceac4-ca86-4f39-8871-15ef4052ff3d" (UID: "aeeceac4-ca86-4f39-8871-15ef4052ff3d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.040845 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeeceac4-ca86-4f39-8871-15ef4052ff3d-kube-api-access-lc5mt" (OuterVolumeSpecName: "kube-api-access-lc5mt") pod "aeeceac4-ca86-4f39-8871-15ef4052ff3d" (UID: "aeeceac4-ca86-4f39-8871-15ef4052ff3d"). InnerVolumeSpecName "kube-api-access-lc5mt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.057937 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeeceac4-ca86-4f39-8871-15ef4052ff3d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "aeeceac4-ca86-4f39-8871-15ef4052ff3d" (UID: "aeeceac4-ca86-4f39-8871-15ef4052ff3d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.094249 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeeceac4-ca86-4f39-8871-15ef4052ff3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aeeceac4-ca86-4f39-8871-15ef4052ff3d" (UID: "aeeceac4-ca86-4f39-8871-15ef4052ff3d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.104424 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeeceac4-ca86-4f39-8871-15ef4052ff3d-config-data" (OuterVolumeSpecName: "config-data") pod "aeeceac4-ca86-4f39-8871-15ef4052ff3d" (UID: "aeeceac4-ca86-4f39-8871-15ef4052ff3d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.128813 4563 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aeeceac4-ca86-4f39-8871-15ef4052ff3d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.128841 4563 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aeeceac4-ca86-4f39-8871-15ef4052ff3d-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.128851 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lc5mt\" (UniqueName: \"kubernetes.io/projected/aeeceac4-ca86-4f39-8871-15ef4052ff3d-kube-api-access-lc5mt\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.128863 4563 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeeceac4-ca86-4f39-8871-15ef4052ff3d-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.128871 4563 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeeceac4-ca86-4f39-8871-15ef4052ff3d-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.128879 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeeceac4-ca86-4f39-8871-15ef4052ff3d-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.128888 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeeceac4-ca86-4f39-8871-15ef4052ff3d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.304806 4563 generic.go:334] "Generic 
(PLEG): container finished" podID="aeeceac4-ca86-4f39-8871-15ef4052ff3d" containerID="813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d" exitCode=0 Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.304837 4563 generic.go:334] "Generic (PLEG): container finished" podID="aeeceac4-ca86-4f39-8871-15ef4052ff3d" containerID="bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47" exitCode=2 Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.304844 4563 generic.go:334] "Generic (PLEG): container finished" podID="aeeceac4-ca86-4f39-8871-15ef4052ff3d" containerID="1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca" exitCode=0 Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.304850 4563 generic.go:334] "Generic (PLEG): container finished" podID="aeeceac4-ca86-4f39-8871-15ef4052ff3d" containerID="44c2ec53ff80cdecbfeb685bde0517afa0463cd95ef78efb84ea97ca83963a29" exitCode=0 Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.304889 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aeeceac4-ca86-4f39-8871-15ef4052ff3d","Type":"ContainerDied","Data":"813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d"} Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.304915 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aeeceac4-ca86-4f39-8871-15ef4052ff3d","Type":"ContainerDied","Data":"bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47"} Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.304926 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aeeceac4-ca86-4f39-8871-15ef4052ff3d","Type":"ContainerDied","Data":"1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca"} Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.304936 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"aeeceac4-ca86-4f39-8871-15ef4052ff3d","Type":"ContainerDied","Data":"44c2ec53ff80cdecbfeb685bde0517afa0463cd95ef78efb84ea97ca83963a29"} Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.304944 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aeeceac4-ca86-4f39-8871-15ef4052ff3d","Type":"ContainerDied","Data":"5076f4e1d49456f48c699e6def7a248cc086b3542f7e39f95d992d87965205aa"} Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.304959 4563 scope.go:117] "RemoveContainer" containerID="813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.305068 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.311017 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ce3468f6-f565-41d5-ad15-302e10230479","Type":"ContainerDied","Data":"4945edf98dff45060f13b65653f7eded9f0af294bf9e293bc7180c54927d3059"} Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.311081 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.326146 4563 scope.go:117] "RemoveContainer" containerID="bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.340690 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.370094 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.373099 4563 scope.go:117] "RemoveContainer" containerID="1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.400148 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:19:48 crc kubenswrapper[4563]: E1124 09:19:48.400661 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeeceac4-ca86-4f39-8871-15ef4052ff3d" containerName="proxy-httpd" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.400679 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeeceac4-ca86-4f39-8871-15ef4052ff3d" containerName="proxy-httpd" Nov 24 09:19:48 crc kubenswrapper[4563]: E1124 09:19:48.400695 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0ee4dc6-69fd-4151-9e35-68fde68c500e" containerName="mariadb-account-create" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.400700 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ee4dc6-69fd-4151-9e35-68fde68c500e" containerName="mariadb-account-create" Nov 24 09:19:48 crc kubenswrapper[4563]: E1124 09:19:48.400715 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71b6f993-c4ff-4ea9-9306-5d30c09f0f8c" containerName="mariadb-account-create" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.400721 4563 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="71b6f993-c4ff-4ea9-9306-5d30c09f0f8c" containerName="mariadb-account-create" Nov 24 09:19:48 crc kubenswrapper[4563]: E1124 09:19:48.400741 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd2c44a4-86c6-43b2-9f19-ab409f2eaded" containerName="mariadb-account-create" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.400747 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd2c44a4-86c6-43b2-9f19-ab409f2eaded" containerName="mariadb-account-create" Nov 24 09:19:48 crc kubenswrapper[4563]: E1124 09:19:48.400753 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71b426e4-d393-4260-b66f-28c288ce8e89" containerName="mariadb-database-create" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.400758 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b426e4-d393-4260-b66f-28c288ce8e89" containerName="mariadb-database-create" Nov 24 09:19:48 crc kubenswrapper[4563]: E1124 09:19:48.400765 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeeceac4-ca86-4f39-8871-15ef4052ff3d" containerName="ceilometer-notification-agent" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.400787 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeeceac4-ca86-4f39-8871-15ef4052ff3d" containerName="ceilometer-notification-agent" Nov 24 09:19:48 crc kubenswrapper[4563]: E1124 09:19:48.400796 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cdd6b81-e677-43b6-a627-ac55f41bb1de" containerName="mariadb-database-create" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.400802 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cdd6b81-e677-43b6-a627-ac55f41bb1de" containerName="mariadb-database-create" Nov 24 09:19:48 crc kubenswrapper[4563]: E1124 09:19:48.400819 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeeceac4-ca86-4f39-8871-15ef4052ff3d" containerName="ceilometer-central-agent" Nov 24 09:19:48 crc 
kubenswrapper[4563]: I1124 09:19:48.400824 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeeceac4-ca86-4f39-8871-15ef4052ff3d" containerName="ceilometer-central-agent" Nov 24 09:19:48 crc kubenswrapper[4563]: E1124 09:19:48.400849 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54187c3a-449a-420f-a04c-f4e6c6f7fc3f" containerName="mariadb-database-create" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.400855 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="54187c3a-449a-420f-a04c-f4e6c6f7fc3f" containerName="mariadb-database-create" Nov 24 09:19:48 crc kubenswrapper[4563]: E1124 09:19:48.400865 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce3468f6-f565-41d5-ad15-302e10230479" containerName="glance-httpd" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.400872 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce3468f6-f565-41d5-ad15-302e10230479" containerName="glance-httpd" Nov 24 09:19:48 crc kubenswrapper[4563]: E1124 09:19:48.400880 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce3468f6-f565-41d5-ad15-302e10230479" containerName="glance-log" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.400886 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce3468f6-f565-41d5-ad15-302e10230479" containerName="glance-log" Nov 24 09:19:48 crc kubenswrapper[4563]: E1124 09:19:48.400893 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeeceac4-ca86-4f39-8871-15ef4052ff3d" containerName="sg-core" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.400898 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeeceac4-ca86-4f39-8871-15ef4052ff3d" containerName="sg-core" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.401055 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce3468f6-f565-41d5-ad15-302e10230479" containerName="glance-httpd" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 
09:19:48.401066 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0ee4dc6-69fd-4151-9e35-68fde68c500e" containerName="mariadb-account-create" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.401075 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeeceac4-ca86-4f39-8871-15ef4052ff3d" containerName="proxy-httpd" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.401087 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="71b6f993-c4ff-4ea9-9306-5d30c09f0f8c" containerName="mariadb-account-create" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.401095 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeeceac4-ca86-4f39-8871-15ef4052ff3d" containerName="sg-core" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.401107 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeeceac4-ca86-4f39-8871-15ef4052ff3d" containerName="ceilometer-central-agent" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.401117 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="71b426e4-d393-4260-b66f-28c288ce8e89" containerName="mariadb-database-create" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.401125 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce3468f6-f565-41d5-ad15-302e10230479" containerName="glance-log" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.401132 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cdd6b81-e677-43b6-a627-ac55f41bb1de" containerName="mariadb-database-create" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.401143 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeeceac4-ca86-4f39-8871-15ef4052ff3d" containerName="ceilometer-notification-agent" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.401149 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd2c44a4-86c6-43b2-9f19-ab409f2eaded" 
containerName="mariadb-account-create" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.401156 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="54187c3a-449a-420f-a04c-f4e6c6f7fc3f" containerName="mariadb-database-create" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.403945 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.405986 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.406140 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.413932 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.417053 4563 scope.go:117] "RemoveContainer" containerID="44c2ec53ff80cdecbfeb685bde0517afa0463cd95ef78efb84ea97ca83963a29" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.422523 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.430164 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.436529 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.438257 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.440195 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.441571 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.442783 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f583c7a2-35a4-4338-b2c4-f069dd971290-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f583c7a2-35a4-4338-b2c4-f069dd971290\") " pod="openstack/ceilometer-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.442881 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f583c7a2-35a4-4338-b2c4-f069dd971290-run-httpd\") pod \"ceilometer-0\" (UID: \"f583c7a2-35a4-4338-b2c4-f069dd971290\") " pod="openstack/ceilometer-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.442973 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f583c7a2-35a4-4338-b2c4-f069dd971290-log-httpd\") pod \"ceilometer-0\" (UID: \"f583c7a2-35a4-4338-b2c4-f069dd971290\") " pod="openstack/ceilometer-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.443033 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gv8l\" (UniqueName: \"kubernetes.io/projected/f583c7a2-35a4-4338-b2c4-f069dd971290-kube-api-access-7gv8l\") pod \"ceilometer-0\" (UID: \"f583c7a2-35a4-4338-b2c4-f069dd971290\") " pod="openstack/ceilometer-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.443193 4563 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f583c7a2-35a4-4338-b2c4-f069dd971290-config-data\") pod \"ceilometer-0\" (UID: \"f583c7a2-35a4-4338-b2c4-f069dd971290\") " pod="openstack/ceilometer-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.443281 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f583c7a2-35a4-4338-b2c4-f069dd971290-scripts\") pod \"ceilometer-0\" (UID: \"f583c7a2-35a4-4338-b2c4-f069dd971290\") " pod="openstack/ceilometer-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.443395 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f583c7a2-35a4-4338-b2c4-f069dd971290-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f583c7a2-35a4-4338-b2c4-f069dd971290\") " pod="openstack/ceilometer-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.445069 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.477393 4563 scope.go:117] "RemoveContainer" containerID="813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d" Nov 24 09:19:48 crc kubenswrapper[4563]: E1124 09:19:48.478044 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d\": container with ID starting with 813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d not found: ID does not exist" containerID="813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.478101 4563 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d"} err="failed to get container status \"813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d\": rpc error: code = NotFound desc = could not find container \"813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d\": container with ID starting with 813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d not found: ID does not exist" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.478128 4563 scope.go:117] "RemoveContainer" containerID="bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47" Nov 24 09:19:48 crc kubenswrapper[4563]: E1124 09:19:48.478420 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47\": container with ID starting with bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47 not found: ID does not exist" containerID="bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.478453 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47"} err="failed to get container status \"bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47\": rpc error: code = NotFound desc = could not find container \"bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47\": container with ID starting with bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47 not found: ID does not exist" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.478476 4563 scope.go:117] "RemoveContainer" containerID="1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca" Nov 24 09:19:48 crc kubenswrapper[4563]: E1124 09:19:48.478809 4563 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca\": container with ID starting with 1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca not found: ID does not exist" containerID="1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.478833 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca"} err="failed to get container status \"1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca\": rpc error: code = NotFound desc = could not find container \"1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca\": container with ID starting with 1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca not found: ID does not exist" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.478849 4563 scope.go:117] "RemoveContainer" containerID="44c2ec53ff80cdecbfeb685bde0517afa0463cd95ef78efb84ea97ca83963a29" Nov 24 09:19:48 crc kubenswrapper[4563]: E1124 09:19:48.479230 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44c2ec53ff80cdecbfeb685bde0517afa0463cd95ef78efb84ea97ca83963a29\": container with ID starting with 44c2ec53ff80cdecbfeb685bde0517afa0463cd95ef78efb84ea97ca83963a29 not found: ID does not exist" containerID="44c2ec53ff80cdecbfeb685bde0517afa0463cd95ef78efb84ea97ca83963a29" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.479260 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44c2ec53ff80cdecbfeb685bde0517afa0463cd95ef78efb84ea97ca83963a29"} err="failed to get container status \"44c2ec53ff80cdecbfeb685bde0517afa0463cd95ef78efb84ea97ca83963a29\": rpc error: code = NotFound desc = could not find container 
\"44c2ec53ff80cdecbfeb685bde0517afa0463cd95ef78efb84ea97ca83963a29\": container with ID starting with 44c2ec53ff80cdecbfeb685bde0517afa0463cd95ef78efb84ea97ca83963a29 not found: ID does not exist" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.479287 4563 scope.go:117] "RemoveContainer" containerID="813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.479746 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d"} err="failed to get container status \"813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d\": rpc error: code = NotFound desc = could not find container \"813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d\": container with ID starting with 813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d not found: ID does not exist" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.479772 4563 scope.go:117] "RemoveContainer" containerID="bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.480295 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47"} err="failed to get container status \"bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47\": rpc error: code = NotFound desc = could not find container \"bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47\": container with ID starting with bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47 not found: ID does not exist" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.480320 4563 scope.go:117] "RemoveContainer" containerID="1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.480661 4563 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca"} err="failed to get container status \"1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca\": rpc error: code = NotFound desc = could not find container \"1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca\": container with ID starting with 1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca not found: ID does not exist" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.480682 4563 scope.go:117] "RemoveContainer" containerID="44c2ec53ff80cdecbfeb685bde0517afa0463cd95ef78efb84ea97ca83963a29" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.481750 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44c2ec53ff80cdecbfeb685bde0517afa0463cd95ef78efb84ea97ca83963a29"} err="failed to get container status \"44c2ec53ff80cdecbfeb685bde0517afa0463cd95ef78efb84ea97ca83963a29\": rpc error: code = NotFound desc = could not find container \"44c2ec53ff80cdecbfeb685bde0517afa0463cd95ef78efb84ea97ca83963a29\": container with ID starting with 44c2ec53ff80cdecbfeb685bde0517afa0463cd95ef78efb84ea97ca83963a29 not found: ID does not exist" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.482191 4563 scope.go:117] "RemoveContainer" containerID="813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.482490 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d"} err="failed to get container status \"813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d\": rpc error: code = NotFound desc = could not find container \"813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d\": container with ID starting with 
813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d not found: ID does not exist" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.482609 4563 scope.go:117] "RemoveContainer" containerID="bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.482874 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47"} err="failed to get container status \"bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47\": rpc error: code = NotFound desc = could not find container \"bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47\": container with ID starting with bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47 not found: ID does not exist" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.482893 4563 scope.go:117] "RemoveContainer" containerID="1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.483095 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca"} err="failed to get container status \"1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca\": rpc error: code = NotFound desc = could not find container \"1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca\": container with ID starting with 1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca not found: ID does not exist" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.483113 4563 scope.go:117] "RemoveContainer" containerID="44c2ec53ff80cdecbfeb685bde0517afa0463cd95ef78efb84ea97ca83963a29" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.483325 4563 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"44c2ec53ff80cdecbfeb685bde0517afa0463cd95ef78efb84ea97ca83963a29"} err="failed to get container status \"44c2ec53ff80cdecbfeb685bde0517afa0463cd95ef78efb84ea97ca83963a29\": rpc error: code = NotFound desc = could not find container \"44c2ec53ff80cdecbfeb685bde0517afa0463cd95ef78efb84ea97ca83963a29\": container with ID starting with 44c2ec53ff80cdecbfeb685bde0517afa0463cd95ef78efb84ea97ca83963a29 not found: ID does not exist" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.483341 4563 scope.go:117] "RemoveContainer" containerID="813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.483791 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d"} err="failed to get container status \"813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d\": rpc error: code = NotFound desc = could not find container \"813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d\": container with ID starting with 813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d not found: ID does not exist" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.483843 4563 scope.go:117] "RemoveContainer" containerID="bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.485489 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47"} err="failed to get container status \"bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47\": rpc error: code = NotFound desc = could not find container \"bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47\": container with ID starting with bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47 not found: ID does not 
exist" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.485515 4563 scope.go:117] "RemoveContainer" containerID="1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.485814 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca"} err="failed to get container status \"1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca\": rpc error: code = NotFound desc = could not find container \"1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca\": container with ID starting with 1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca not found: ID does not exist" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.485906 4563 scope.go:117] "RemoveContainer" containerID="44c2ec53ff80cdecbfeb685bde0517afa0463cd95ef78efb84ea97ca83963a29" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.486150 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44c2ec53ff80cdecbfeb685bde0517afa0463cd95ef78efb84ea97ca83963a29"} err="failed to get container status \"44c2ec53ff80cdecbfeb685bde0517afa0463cd95ef78efb84ea97ca83963a29\": rpc error: code = NotFound desc = could not find container \"44c2ec53ff80cdecbfeb685bde0517afa0463cd95ef78efb84ea97ca83963a29\": container with ID starting with 44c2ec53ff80cdecbfeb685bde0517afa0463cd95ef78efb84ea97ca83963a29 not found: ID does not exist" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.486170 4563 scope.go:117] "RemoveContainer" containerID="3304aad3aaefb9843dcbd7d53d407c72fbd21eb60a457f9eb5ea04521218050e" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.517999 4563 scope.go:117] "RemoveContainer" containerID="e4972f237565195a76103efe1025e783d7e0d1366c6f017de9154a12791baa96" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.545263 4563 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gv8l\" (UniqueName: \"kubernetes.io/projected/f583c7a2-35a4-4338-b2c4-f069dd971290-kube-api-access-7gv8l\") pod \"ceilometer-0\" (UID: \"f583c7a2-35a4-4338-b2c4-f069dd971290\") " pod="openstack/ceilometer-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.545301 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f583c7a2-35a4-4338-b2c4-f069dd971290-log-httpd\") pod \"ceilometer-0\" (UID: \"f583c7a2-35a4-4338-b2c4-f069dd971290\") " pod="openstack/ceilometer-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.545327 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6624aa1-6acc-43a1-944e-20a77c1b09d9-config-data\") pod \"glance-default-external-api-0\" (UID: \"f6624aa1-6acc-43a1-944e-20a77c1b09d9\") " pod="openstack/glance-default-external-api-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.545357 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6624aa1-6acc-43a1-944e-20a77c1b09d9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f6624aa1-6acc-43a1-944e-20a77c1b09d9\") " pod="openstack/glance-default-external-api-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.545672 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f583c7a2-35a4-4338-b2c4-f069dd971290-config-data\") pod \"ceilometer-0\" (UID: \"f583c7a2-35a4-4338-b2c4-f069dd971290\") " pod="openstack/ceilometer-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.545729 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f583c7a2-35a4-4338-b2c4-f069dd971290-scripts\") pod \"ceilometer-0\" (UID: \"f583c7a2-35a4-4338-b2c4-f069dd971290\") " pod="openstack/ceilometer-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.545759 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6624aa1-6acc-43a1-944e-20a77c1b09d9-scripts\") pod \"glance-default-external-api-0\" (UID: \"f6624aa1-6acc-43a1-944e-20a77c1b09d9\") " pod="openstack/glance-default-external-api-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.545802 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"f6624aa1-6acc-43a1-944e-20a77c1b09d9\") " pod="openstack/glance-default-external-api-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.545829 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f6624aa1-6acc-43a1-944e-20a77c1b09d9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f6624aa1-6acc-43a1-944e-20a77c1b09d9\") " pod="openstack/glance-default-external-api-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.546005 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f583c7a2-35a4-4338-b2c4-f069dd971290-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f583c7a2-35a4-4338-b2c4-f069dd971290\") " pod="openstack/ceilometer-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.546068 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6624aa1-6acc-43a1-944e-20a77c1b09d9-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"f6624aa1-6acc-43a1-944e-20a77c1b09d9\") " pod="openstack/glance-default-external-api-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.546105 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f583c7a2-35a4-4338-b2c4-f069dd971290-log-httpd\") pod \"ceilometer-0\" (UID: \"f583c7a2-35a4-4338-b2c4-f069dd971290\") " pod="openstack/ceilometer-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.546162 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f583c7a2-35a4-4338-b2c4-f069dd971290-run-httpd\") pod \"ceilometer-0\" (UID: \"f583c7a2-35a4-4338-b2c4-f069dd971290\") " pod="openstack/ceilometer-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.546441 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f583c7a2-35a4-4338-b2c4-f069dd971290-run-httpd\") pod \"ceilometer-0\" (UID: \"f583c7a2-35a4-4338-b2c4-f069dd971290\") " pod="openstack/ceilometer-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.546366 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f583c7a2-35a4-4338-b2c4-f069dd971290-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f583c7a2-35a4-4338-b2c4-f069dd971290\") " pod="openstack/ceilometer-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.546502 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6624aa1-6acc-43a1-944e-20a77c1b09d9-logs\") pod \"glance-default-external-api-0\" (UID: \"f6624aa1-6acc-43a1-944e-20a77c1b09d9\") " pod="openstack/glance-default-external-api-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.546681 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d45pw\" (UniqueName: \"kubernetes.io/projected/f6624aa1-6acc-43a1-944e-20a77c1b09d9-kube-api-access-d45pw\") pod \"glance-default-external-api-0\" (UID: \"f6624aa1-6acc-43a1-944e-20a77c1b09d9\") " pod="openstack/glance-default-external-api-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.550488 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f583c7a2-35a4-4338-b2c4-f069dd971290-config-data\") pod \"ceilometer-0\" (UID: \"f583c7a2-35a4-4338-b2c4-f069dd971290\") " pod="openstack/ceilometer-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.551737 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f583c7a2-35a4-4338-b2c4-f069dd971290-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f583c7a2-35a4-4338-b2c4-f069dd971290\") " pod="openstack/ceilometer-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.553851 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f583c7a2-35a4-4338-b2c4-f069dd971290-scripts\") pod \"ceilometer-0\" (UID: \"f583c7a2-35a4-4338-b2c4-f069dd971290\") " pod="openstack/ceilometer-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.561713 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gv8l\" (UniqueName: \"kubernetes.io/projected/f583c7a2-35a4-4338-b2c4-f069dd971290-kube-api-access-7gv8l\") pod \"ceilometer-0\" (UID: \"f583c7a2-35a4-4338-b2c4-f069dd971290\") " pod="openstack/ceilometer-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.563304 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f583c7a2-35a4-4338-b2c4-f069dd971290-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"f583c7a2-35a4-4338-b2c4-f069dd971290\") " pod="openstack/ceilometer-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.648918 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6624aa1-6acc-43a1-944e-20a77c1b09d9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f6624aa1-6acc-43a1-944e-20a77c1b09d9\") " pod="openstack/glance-default-external-api-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.649058 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6624aa1-6acc-43a1-944e-20a77c1b09d9-logs\") pod \"glance-default-external-api-0\" (UID: \"f6624aa1-6acc-43a1-944e-20a77c1b09d9\") " pod="openstack/glance-default-external-api-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.649156 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d45pw\" (UniqueName: \"kubernetes.io/projected/f6624aa1-6acc-43a1-944e-20a77c1b09d9-kube-api-access-d45pw\") pod \"glance-default-external-api-0\" (UID: \"f6624aa1-6acc-43a1-944e-20a77c1b09d9\") " pod="openstack/glance-default-external-api-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.649217 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6624aa1-6acc-43a1-944e-20a77c1b09d9-config-data\") pod \"glance-default-external-api-0\" (UID: \"f6624aa1-6acc-43a1-944e-20a77c1b09d9\") " pod="openstack/glance-default-external-api-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.649245 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6624aa1-6acc-43a1-944e-20a77c1b09d9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f6624aa1-6acc-43a1-944e-20a77c1b09d9\") " 
pod="openstack/glance-default-external-api-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.649356 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6624aa1-6acc-43a1-944e-20a77c1b09d9-scripts\") pod \"glance-default-external-api-0\" (UID: \"f6624aa1-6acc-43a1-944e-20a77c1b09d9\") " pod="openstack/glance-default-external-api-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.649397 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"f6624aa1-6acc-43a1-944e-20a77c1b09d9\") " pod="openstack/glance-default-external-api-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.649451 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f6624aa1-6acc-43a1-944e-20a77c1b09d9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f6624aa1-6acc-43a1-944e-20a77c1b09d9\") " pod="openstack/glance-default-external-api-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.649824 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6624aa1-6acc-43a1-944e-20a77c1b09d9-logs\") pod \"glance-default-external-api-0\" (UID: \"f6624aa1-6acc-43a1-944e-20a77c1b09d9\") " pod="openstack/glance-default-external-api-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.649994 4563 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"f6624aa1-6acc-43a1-944e-20a77c1b09d9\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.650025 
4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f6624aa1-6acc-43a1-944e-20a77c1b09d9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f6624aa1-6acc-43a1-944e-20a77c1b09d9\") " pod="openstack/glance-default-external-api-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.657541 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6624aa1-6acc-43a1-944e-20a77c1b09d9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f6624aa1-6acc-43a1-944e-20a77c1b09d9\") " pod="openstack/glance-default-external-api-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.657786 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6624aa1-6acc-43a1-944e-20a77c1b09d9-scripts\") pod \"glance-default-external-api-0\" (UID: \"f6624aa1-6acc-43a1-944e-20a77c1b09d9\") " pod="openstack/glance-default-external-api-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.658095 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6624aa1-6acc-43a1-944e-20a77c1b09d9-config-data\") pod \"glance-default-external-api-0\" (UID: \"f6624aa1-6acc-43a1-944e-20a77c1b09d9\") " pod="openstack/glance-default-external-api-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.658993 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6624aa1-6acc-43a1-944e-20a77c1b09d9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f6624aa1-6acc-43a1-944e-20a77c1b09d9\") " pod="openstack/glance-default-external-api-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.664157 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d45pw\" (UniqueName: 
\"kubernetes.io/projected/f6624aa1-6acc-43a1-944e-20a77c1b09d9-kube-api-access-d45pw\") pod \"glance-default-external-api-0\" (UID: \"f6624aa1-6acc-43a1-944e-20a77c1b09d9\") " pod="openstack/glance-default-external-api-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.675351 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"f6624aa1-6acc-43a1-944e-20a77c1b09d9\") " pod="openstack/glance-default-external-api-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.778895 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 09:19:48 crc kubenswrapper[4563]: I1124 09:19:48.786486 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 24 09:19:49 crc kubenswrapper[4563]: I1124 09:19:49.064590 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeeceac4-ca86-4f39-8871-15ef4052ff3d" path="/var/lib/kubelet/pods/aeeceac4-ca86-4f39-8871-15ef4052ff3d/volumes" Nov 24 09:19:49 crc kubenswrapper[4563]: I1124 09:19:49.065561 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce3468f6-f565-41d5-ad15-302e10230479" path="/var/lib/kubelet/pods/ce3468f6-f565-41d5-ad15-302e10230479/volumes" Nov 24 09:19:49 crc kubenswrapper[4563]: I1124 09:19:49.274085 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:19:49 crc kubenswrapper[4563]: I1124 09:19:49.321069 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f583c7a2-35a4-4338-b2c4-f069dd971290","Type":"ContainerStarted","Data":"f3651c47d5a58a046d36d14cd8322d8efac93a98fe9ddadcc8f3e5af772220d4"} Nov 24 09:19:49 crc kubenswrapper[4563]: I1124 09:19:49.421525 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-external-api-0"] Nov 24 09:19:49 crc kubenswrapper[4563]: W1124 09:19:49.766887 4563 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54187c3a_449a_420f_a04c_f4e6c6f7fc3f.slice/crio-conmon-3c26adbc8059ff965ca66c7fa73df912c4211f8febaa28943e3cffd7423ac3c5.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54187c3a_449a_420f_a04c_f4e6c6f7fc3f.slice/crio-conmon-3c26adbc8059ff965ca66c7fa73df912c4211f8febaa28943e3cffd7423ac3c5.scope: no such file or directory Nov 24 09:19:49 crc kubenswrapper[4563]: W1124 09:19:49.767018 4563 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54187c3a_449a_420f_a04c_f4e6c6f7fc3f.slice/crio-3c26adbc8059ff965ca66c7fa73df912c4211f8febaa28943e3cffd7423ac3c5.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54187c3a_449a_420f_a04c_f4e6c6f7fc3f.slice/crio-3c26adbc8059ff965ca66c7fa73df912c4211f8febaa28943e3cffd7423ac3c5.scope: no such file or directory Nov 24 09:19:49 crc kubenswrapper[4563]: W1124 09:19:49.774545 4563 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaeeceac4_ca86_4f39_8871_15ef4052ff3d.slice/crio-conmon-1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaeeceac4_ca86_4f39_8871_15ef4052ff3d.slice/crio-conmon-1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca.scope: no such file or directory Nov 24 09:19:49 crc kubenswrapper[4563]: W1124 09:19:49.783786 4563 watcher.go:93] Error while processing event 
("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaeeceac4_ca86_4f39_8871_15ef4052ff3d.slice/crio-1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaeeceac4_ca86_4f39_8871_15ef4052ff3d.slice/crio-1881513342d6343db6aa06d5fe7af23d7d65ea645bf3cdf857bb212f47bcb8ca.scope: no such file or directory Nov 24 09:19:49 crc kubenswrapper[4563]: W1124 09:19:49.786173 4563 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaeeceac4_ca86_4f39_8871_15ef4052ff3d.slice/crio-conmon-bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaeeceac4_ca86_4f39_8871_15ef4052ff3d.slice/crio-conmon-bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47.scope: no such file or directory Nov 24 09:19:49 crc kubenswrapper[4563]: W1124 09:19:49.786206 4563 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaeeceac4_ca86_4f39_8871_15ef4052ff3d.slice/crio-bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaeeceac4_ca86_4f39_8871_15ef4052ff3d.slice/crio-bdb0374df03abdca35f622ada6d81387d4b929eb6d8ae78aecfcd4b580d0fd47.scope: no such file or directory Nov 24 09:19:49 crc kubenswrapper[4563]: W1124 09:19:49.786429 4563 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaeeceac4_ca86_4f39_8871_15ef4052ff3d.slice/crio-conmon-813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d.scope": 
0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaeeceac4_ca86_4f39_8871_15ef4052ff3d.slice/crio-conmon-813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d.scope: no such file or directory Nov 24 09:19:49 crc kubenswrapper[4563]: W1124 09:19:49.786456 4563 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaeeceac4_ca86_4f39_8871_15ef4052ff3d.slice/crio-813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaeeceac4_ca86_4f39_8871_15ef4052ff3d.slice/crio-813603d882e34415569f555c72be43792d25c573846a92f148940557c2a4af8d.scope: no such file or directory Nov 24 09:19:49 crc kubenswrapper[4563]: E1124 09:19:49.988057 4563 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71b426e4_d393_4260_b66f_28c288ce8e89.slice/crio-ddc5944ed6a99ebd2242563f21f9e4d2b349560cd808f37819fd0170fae75429\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce3468f6_f565_41d5_ad15_302e10230479.slice/crio-e4972f237565195a76103efe1025e783d7e0d1366c6f017de9154a12791baa96.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0ee4dc6_69fd_4151_9e35_68fde68c500e.slice/crio-conmon-7bd613c1e631f06f8c99dc293308cdd878b574255f51170c4e79e7db1a5cba3f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce3468f6_f565_41d5_ad15_302e10230479.slice/crio-conmon-3304aad3aaefb9843dcbd7d53d407c72fbd21eb60a457f9eb5ea04521218050e.scope\": RecentStats: unable to find 
data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce3468f6_f565_41d5_ad15_302e10230479.slice/crio-conmon-e4972f237565195a76103efe1025e783d7e0d1366c6f017de9154a12791baa96.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce3468f6_f565_41d5_ad15_302e10230479.slice/crio-4945edf98dff45060f13b65653f7eded9f0af294bf9e293bc7180c54927d3059\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd2c44a4_86c6_43b2_9f19_ab409f2eaded.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cdd6b81_e677_43b6_a627_ac55f41bb1de.slice/crio-02e11f7ed9f8405e3cf10c111af794a64a16d9a10e8e0650eeb56c30adb36d31\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce3468f6_f565_41d5_ad15_302e10230479.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ec5b651_57ef_414b_8c8e_4b488d71663f.slice/crio-conmon-babd8f40dad9a932d16d97dcc9b930bba3d8e659d2c29350a0aba7329d1d2209.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd2c44a4_86c6_43b2_9f19_ab409f2eaded.slice/crio-conmon-6f2af6eac37517ee8e9aee60a7627d32c3557a461b9321cebbc2c5326f4e2fa2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaeeceac4_ca86_4f39_8871_15ef4052ff3d.slice/crio-conmon-44c2ec53ff80cdecbfeb685bde0517afa0463cd95ef78efb84ea97ca83963a29.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaeeceac4_ca86_4f39_8871_15ef4052ff3d.slice/crio-5076f4e1d49456f48c699e6def7a248cc086b3542f7e39f95d992d87965205aa\": 
RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ec5b651_57ef_414b_8c8e_4b488d71663f.slice/crio-babd8f40dad9a932d16d97dcc9b930bba3d8e659d2c29350a0aba7329d1d2209.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd2c44a4_86c6_43b2_9f19_ab409f2eaded.slice/crio-7ae679599bc794c7c7adbf8bfa5a2c583b5c98caf19f76a5f2f603222487061b\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71b6f993_c4ff_4ea9_9306_5d30c09f0f8c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0ee4dc6_69fd_4151_9e35_68fde68c500e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71b6f993_c4ff_4ea9_9306_5d30c09f0f8c.slice/crio-8b6990e933da2cce61a802d5675f2789a7feac86269489dff9730ab9da5e6da3\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cdd6b81_e677_43b6_a627_ac55f41bb1de.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd2c44a4_86c6_43b2_9f19_ab409f2eaded.slice/crio-6f2af6eac37517ee8e9aee60a7627d32c3557a461b9321cebbc2c5326f4e2fa2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71b426e4_d393_4260_b66f_28c288ce8e89.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cdd6b81_e677_43b6_a627_ac55f41bb1de.slice/crio-fdf38bf1b96efef1bc076e890ef650066b9626aa1431bc638ade4e93c4df030b.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaeeceac4_ca86_4f39_8871_15ef4052ff3d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0ee4dc6_69fd_4151_9e35_68fde68c500e.slice/crio-d955a94b14b1bf96e9b2ef4fe719ec77e0a9573100378a905d972261f4f2df70\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54187c3a_449a_420f_a04c_f4e6c6f7fc3f.slice/crio-802ecabd501e2bfc5a5e3242a2eb4a0dd4dddb84047c0bb5cef90e60a0ac845b\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce3468f6_f565_41d5_ad15_302e10230479.slice/crio-3304aad3aaefb9843dcbd7d53d407c72fbd21eb60a457f9eb5ea04521218050e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cdd6b81_e677_43b6_a627_ac55f41bb1de.slice/crio-conmon-fdf38bf1b96efef1bc076e890ef650066b9626aa1431bc638ade4e93c4df030b.scope\": RecentStats: unable to find data in memory cache]" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.033621 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.185217 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ec5b651-57ef-414b-8c8e-4b488d71663f-combined-ca-bundle\") pod \"0ec5b651-57ef-414b-8c8e-4b488d71663f\" (UID: \"0ec5b651-57ef-414b-8c8e-4b488d71663f\") " Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.185263 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ec5b651-57ef-414b-8c8e-4b488d71663f-logs\") pod \"0ec5b651-57ef-414b-8c8e-4b488d71663f\" (UID: \"0ec5b651-57ef-414b-8c8e-4b488d71663f\") " Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.185303 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ec5b651-57ef-414b-8c8e-4b488d71663f-horizon-tls-certs\") pod \"0ec5b651-57ef-414b-8c8e-4b488d71663f\" (UID: \"0ec5b651-57ef-414b-8c8e-4b488d71663f\") " Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.185346 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2sch\" (UniqueName: \"kubernetes.io/projected/0ec5b651-57ef-414b-8c8e-4b488d71663f-kube-api-access-d2sch\") pod \"0ec5b651-57ef-414b-8c8e-4b488d71663f\" (UID: \"0ec5b651-57ef-414b-8c8e-4b488d71663f\") " Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.185700 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ec5b651-57ef-414b-8c8e-4b488d71663f-config-data\") pod \"0ec5b651-57ef-414b-8c8e-4b488d71663f\" (UID: \"0ec5b651-57ef-414b-8c8e-4b488d71663f\") " Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.185731 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/0ec5b651-57ef-414b-8c8e-4b488d71663f-scripts\") pod \"0ec5b651-57ef-414b-8c8e-4b488d71663f\" (UID: \"0ec5b651-57ef-414b-8c8e-4b488d71663f\") " Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.185791 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0ec5b651-57ef-414b-8c8e-4b488d71663f-horizon-secret-key\") pod \"0ec5b651-57ef-414b-8c8e-4b488d71663f\" (UID: \"0ec5b651-57ef-414b-8c8e-4b488d71663f\") " Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.185836 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ec5b651-57ef-414b-8c8e-4b488d71663f-logs" (OuterVolumeSpecName: "logs") pod "0ec5b651-57ef-414b-8c8e-4b488d71663f" (UID: "0ec5b651-57ef-414b-8c8e-4b488d71663f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.186551 4563 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ec5b651-57ef-414b-8c8e-4b488d71663f-logs\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.191331 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ec5b651-57ef-414b-8c8e-4b488d71663f-kube-api-access-d2sch" (OuterVolumeSpecName: "kube-api-access-d2sch") pod "0ec5b651-57ef-414b-8c8e-4b488d71663f" (UID: "0ec5b651-57ef-414b-8c8e-4b488d71663f"). InnerVolumeSpecName "kube-api-access-d2sch". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.194073 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ec5b651-57ef-414b-8c8e-4b488d71663f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "0ec5b651-57ef-414b-8c8e-4b488d71663f" (UID: "0ec5b651-57ef-414b-8c8e-4b488d71663f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.215266 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ec5b651-57ef-414b-8c8e-4b488d71663f-scripts" (OuterVolumeSpecName: "scripts") pod "0ec5b651-57ef-414b-8c8e-4b488d71663f" (UID: "0ec5b651-57ef-414b-8c8e-4b488d71663f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.217501 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ec5b651-57ef-414b-8c8e-4b488d71663f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ec5b651-57ef-414b-8c8e-4b488d71663f" (UID: "0ec5b651-57ef-414b-8c8e-4b488d71663f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.224530 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ec5b651-57ef-414b-8c8e-4b488d71663f-config-data" (OuterVolumeSpecName: "config-data") pod "0ec5b651-57ef-414b-8c8e-4b488d71663f" (UID: "0ec5b651-57ef-414b-8c8e-4b488d71663f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.237945 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ec5b651-57ef-414b-8c8e-4b488d71663f-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "0ec5b651-57ef-414b-8c8e-4b488d71663f" (UID: "0ec5b651-57ef-414b-8c8e-4b488d71663f"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.287727 4563 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0ec5b651-57ef-414b-8c8e-4b488d71663f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.287758 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ec5b651-57ef-414b-8c8e-4b488d71663f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.287768 4563 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ec5b651-57ef-414b-8c8e-4b488d71663f-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.287777 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2sch\" (UniqueName: \"kubernetes.io/projected/0ec5b651-57ef-414b-8c8e-4b488d71663f-kube-api-access-d2sch\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.287786 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ec5b651-57ef-414b-8c8e-4b488d71663f-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.287794 4563 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/0ec5b651-57ef-414b-8c8e-4b488d71663f-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.336493 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f6624aa1-6acc-43a1-944e-20a77c1b09d9","Type":"ContainerStarted","Data":"8b9b7a19cba3bcbb5a9a7ac01c3a876ade0b7710d8d7832da8d37a402c51c26f"} Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.336535 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f6624aa1-6acc-43a1-944e-20a77c1b09d9","Type":"ContainerStarted","Data":"1bbf2038c3cbc59f4cc34ab84b9aa5f08bcf960bb76a9a3e942bc308a3a008a7"} Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.338178 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f583c7a2-35a4-4338-b2c4-f069dd971290","Type":"ContainerStarted","Data":"4981168a34838ff03c867b84b3fd6379a315ed30af3944da659cb7cb34e064c3"} Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.340167 4563 generic.go:334] "Generic (PLEG): container finished" podID="0ec5b651-57ef-414b-8c8e-4b488d71663f" containerID="babd8f40dad9a932d16d97dcc9b930bba3d8e659d2c29350a0aba7329d1d2209" exitCode=137 Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.340206 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cd5c59c66-hrmf5" event={"ID":"0ec5b651-57ef-414b-8c8e-4b488d71663f","Type":"ContainerDied","Data":"babd8f40dad9a932d16d97dcc9b930bba3d8e659d2c29350a0aba7329d1d2209"} Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.340215 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7cd5c59c66-hrmf5" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.340242 4563 scope.go:117] "RemoveContainer" containerID="580a67c88349862289ccfdb112589708d48749b86e526a80925a6ba1dc67dab0" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.340230 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cd5c59c66-hrmf5" event={"ID":"0ec5b651-57ef-414b-8c8e-4b488d71663f","Type":"ContainerDied","Data":"3fabc59ee478fa8ffa39b7f3bb0c8faa2e304247a0963af93e364c189c2d2cb3"} Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.382041 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7cd5c59c66-hrmf5"] Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.399107 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7cd5c59c66-hrmf5"] Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.474762 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rgwhs"] Nov 24 09:19:50 crc kubenswrapper[4563]: E1124 09:19:50.475453 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ec5b651-57ef-414b-8c8e-4b488d71663f" containerName="horizon-log" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.475465 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ec5b651-57ef-414b-8c8e-4b488d71663f" containerName="horizon-log" Nov 24 09:19:50 crc kubenswrapper[4563]: E1124 09:19:50.475499 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ec5b651-57ef-414b-8c8e-4b488d71663f" containerName="horizon" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.475504 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ec5b651-57ef-414b-8c8e-4b488d71663f" containerName="horizon" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.475973 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ec5b651-57ef-414b-8c8e-4b488d71663f" 
containerName="horizon" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.476010 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ec5b651-57ef-414b-8c8e-4b488d71663f" containerName="horizon-log" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.476666 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rgwhs" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.477371 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rgwhs"] Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.478945 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.479236 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-clr79" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.479353 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.565610 4563 scope.go:117] "RemoveContainer" containerID="babd8f40dad9a932d16d97dcc9b930bba3d8e659d2c29350a0aba7329d1d2209" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.598215 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c326972-9b5f-4f8c-b71d-6811d65b31e1-config-data\") pod \"nova-cell0-conductor-db-sync-rgwhs\" (UID: \"8c326972-9b5f-4f8c-b71d-6811d65b31e1\") " pod="openstack/nova-cell0-conductor-db-sync-rgwhs" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.598330 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c326972-9b5f-4f8c-b71d-6811d65b31e1-combined-ca-bundle\") pod 
\"nova-cell0-conductor-db-sync-rgwhs\" (UID: \"8c326972-9b5f-4f8c-b71d-6811d65b31e1\") " pod="openstack/nova-cell0-conductor-db-sync-rgwhs" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.598446 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c326972-9b5f-4f8c-b71d-6811d65b31e1-scripts\") pod \"nova-cell0-conductor-db-sync-rgwhs\" (UID: \"8c326972-9b5f-4f8c-b71d-6811d65b31e1\") " pod="openstack/nova-cell0-conductor-db-sync-rgwhs" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.598483 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwglj\" (UniqueName: \"kubernetes.io/projected/8c326972-9b5f-4f8c-b71d-6811d65b31e1-kube-api-access-hwglj\") pod \"nova-cell0-conductor-db-sync-rgwhs\" (UID: \"8c326972-9b5f-4f8c-b71d-6811d65b31e1\") " pod="openstack/nova-cell0-conductor-db-sync-rgwhs" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.602679 4563 scope.go:117] "RemoveContainer" containerID="580a67c88349862289ccfdb112589708d48749b86e526a80925a6ba1dc67dab0" Nov 24 09:19:50 crc kubenswrapper[4563]: E1124 09:19:50.602984 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"580a67c88349862289ccfdb112589708d48749b86e526a80925a6ba1dc67dab0\": container with ID starting with 580a67c88349862289ccfdb112589708d48749b86e526a80925a6ba1dc67dab0 not found: ID does not exist" containerID="580a67c88349862289ccfdb112589708d48749b86e526a80925a6ba1dc67dab0" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.603017 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"580a67c88349862289ccfdb112589708d48749b86e526a80925a6ba1dc67dab0"} err="failed to get container status \"580a67c88349862289ccfdb112589708d48749b86e526a80925a6ba1dc67dab0\": rpc error: code = NotFound desc = could 
not find container \"580a67c88349862289ccfdb112589708d48749b86e526a80925a6ba1dc67dab0\": container with ID starting with 580a67c88349862289ccfdb112589708d48749b86e526a80925a6ba1dc67dab0 not found: ID does not exist" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.603039 4563 scope.go:117] "RemoveContainer" containerID="babd8f40dad9a932d16d97dcc9b930bba3d8e659d2c29350a0aba7329d1d2209" Nov 24 09:19:50 crc kubenswrapper[4563]: E1124 09:19:50.603263 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"babd8f40dad9a932d16d97dcc9b930bba3d8e659d2c29350a0aba7329d1d2209\": container with ID starting with babd8f40dad9a932d16d97dcc9b930bba3d8e659d2c29350a0aba7329d1d2209 not found: ID does not exist" containerID="babd8f40dad9a932d16d97dcc9b930bba3d8e659d2c29350a0aba7329d1d2209" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.603288 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"babd8f40dad9a932d16d97dcc9b930bba3d8e659d2c29350a0aba7329d1d2209"} err="failed to get container status \"babd8f40dad9a932d16d97dcc9b930bba3d8e659d2c29350a0aba7329d1d2209\": rpc error: code = NotFound desc = could not find container \"babd8f40dad9a932d16d97dcc9b930bba3d8e659d2c29350a0aba7329d1d2209\": container with ID starting with babd8f40dad9a932d16d97dcc9b930bba3d8e659d2c29350a0aba7329d1d2209 not found: ID does not exist" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.700060 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c326972-9b5f-4f8c-b71d-6811d65b31e1-scripts\") pod \"nova-cell0-conductor-db-sync-rgwhs\" (UID: \"8c326972-9b5f-4f8c-b71d-6811d65b31e1\") " pod="openstack/nova-cell0-conductor-db-sync-rgwhs" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.700108 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hwglj\" (UniqueName: \"kubernetes.io/projected/8c326972-9b5f-4f8c-b71d-6811d65b31e1-kube-api-access-hwglj\") pod \"nova-cell0-conductor-db-sync-rgwhs\" (UID: \"8c326972-9b5f-4f8c-b71d-6811d65b31e1\") " pod="openstack/nova-cell0-conductor-db-sync-rgwhs" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.700167 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c326972-9b5f-4f8c-b71d-6811d65b31e1-config-data\") pod \"nova-cell0-conductor-db-sync-rgwhs\" (UID: \"8c326972-9b5f-4f8c-b71d-6811d65b31e1\") " pod="openstack/nova-cell0-conductor-db-sync-rgwhs" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.700227 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c326972-9b5f-4f8c-b71d-6811d65b31e1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rgwhs\" (UID: \"8c326972-9b5f-4f8c-b71d-6811d65b31e1\") " pod="openstack/nova-cell0-conductor-db-sync-rgwhs" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.704895 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c326972-9b5f-4f8c-b71d-6811d65b31e1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rgwhs\" (UID: \"8c326972-9b5f-4f8c-b71d-6811d65b31e1\") " pod="openstack/nova-cell0-conductor-db-sync-rgwhs" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.705333 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c326972-9b5f-4f8c-b71d-6811d65b31e1-scripts\") pod \"nova-cell0-conductor-db-sync-rgwhs\" (UID: \"8c326972-9b5f-4f8c-b71d-6811d65b31e1\") " pod="openstack/nova-cell0-conductor-db-sync-rgwhs" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.707174 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/8c326972-9b5f-4f8c-b71d-6811d65b31e1-config-data\") pod \"nova-cell0-conductor-db-sync-rgwhs\" (UID: \"8c326972-9b5f-4f8c-b71d-6811d65b31e1\") " pod="openstack/nova-cell0-conductor-db-sync-rgwhs" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.725437 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwglj\" (UniqueName: \"kubernetes.io/projected/8c326972-9b5f-4f8c-b71d-6811d65b31e1-kube-api-access-hwglj\") pod \"nova-cell0-conductor-db-sync-rgwhs\" (UID: \"8c326972-9b5f-4f8c-b71d-6811d65b31e1\") " pod="openstack/nova-cell0-conductor-db-sync-rgwhs" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.811664 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rgwhs" Nov 24 09:19:50 crc kubenswrapper[4563]: I1124 09:19:50.947321 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:19:51 crc kubenswrapper[4563]: I1124 09:19:51.074564 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ec5b651-57ef-414b-8c8e-4b488d71663f" path="/var/lib/kubelet/pods/0ec5b651-57ef-414b-8c8e-4b488d71663f/volumes" Nov 24 09:19:51 crc kubenswrapper[4563]: W1124 09:19:51.251900 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c326972_9b5f_4f8c_b71d_6811d65b31e1.slice/crio-2967add47666fe873c918e515176bc69710e5be87eff8eca34480859284ba562 WatchSource:0}: Error finding container 2967add47666fe873c918e515176bc69710e5be87eff8eca34480859284ba562: Status 404 returned error can't find the container with id 2967add47666fe873c918e515176bc69710e5be87eff8eca34480859284ba562 Nov 24 09:19:51 crc kubenswrapper[4563]: I1124 09:19:51.252799 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rgwhs"] Nov 24 09:19:51 crc kubenswrapper[4563]: I1124 
09:19:51.349224 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rgwhs" event={"ID":"8c326972-9b5f-4f8c-b71d-6811d65b31e1","Type":"ContainerStarted","Data":"2967add47666fe873c918e515176bc69710e5be87eff8eca34480859284ba562"} Nov 24 09:19:51 crc kubenswrapper[4563]: I1124 09:19:51.352129 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f6624aa1-6acc-43a1-944e-20a77c1b09d9","Type":"ContainerStarted","Data":"93e44b8a7adcebcd2589bba3b1a5a008b421c7023d59c92df21650eb874e19b9"} Nov 24 09:19:51 crc kubenswrapper[4563]: I1124 09:19:51.353747 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f583c7a2-35a4-4338-b2c4-f069dd971290","Type":"ContainerStarted","Data":"dac9b5d69811900e3bfb1caf853a34390014ef24d9d606202438d294088696c3"} Nov 24 09:19:51 crc kubenswrapper[4563]: I1124 09:19:51.381174 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.381162184 podStartE2EDuration="3.381162184s" podCreationTimestamp="2025-11-24 09:19:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:19:51.375312471 +0000 UTC m=+968.634289919" watchObservedRunningTime="2025-11-24 09:19:51.381162184 +0000 UTC m=+968.640139631" Nov 24 09:19:52 crc kubenswrapper[4563]: I1124 09:19:52.381751 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f583c7a2-35a4-4338-b2c4-f069dd971290","Type":"ContainerStarted","Data":"922516add2fdedb8af7768b7e3677fe01a4367146ea6219861478b86f27023cb"} Nov 24 09:19:52 crc kubenswrapper[4563]: I1124 09:19:52.613899 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:52 crc kubenswrapper[4563]: I1124 
09:19:52.621614 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5487cdb76f-rn9rx" Nov 24 09:19:53 crc kubenswrapper[4563]: I1124 09:19:53.393225 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f583c7a2-35a4-4338-b2c4-f069dd971290","Type":"ContainerStarted","Data":"73edaab2504f2cd89e195d68f32fe1912a83fbf60133f5d4ee50dc90b5a36b7a"} Nov 24 09:19:53 crc kubenswrapper[4563]: I1124 09:19:53.393531 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f583c7a2-35a4-4338-b2c4-f069dd971290" containerName="ceilometer-central-agent" containerID="cri-o://4981168a34838ff03c867b84b3fd6379a315ed30af3944da659cb7cb34e064c3" gracePeriod=30 Nov 24 09:19:53 crc kubenswrapper[4563]: I1124 09:19:53.393725 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f583c7a2-35a4-4338-b2c4-f069dd971290" containerName="proxy-httpd" containerID="cri-o://73edaab2504f2cd89e195d68f32fe1912a83fbf60133f5d4ee50dc90b5a36b7a" gracePeriod=30 Nov 24 09:19:53 crc kubenswrapper[4563]: I1124 09:19:53.393794 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f583c7a2-35a4-4338-b2c4-f069dd971290" containerName="sg-core" containerID="cri-o://922516add2fdedb8af7768b7e3677fe01a4367146ea6219861478b86f27023cb" gracePeriod=30 Nov 24 09:19:53 crc kubenswrapper[4563]: I1124 09:19:53.393825 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f583c7a2-35a4-4338-b2c4-f069dd971290" containerName="ceilometer-notification-agent" containerID="cri-o://dac9b5d69811900e3bfb1caf853a34390014ef24d9d606202438d294088696c3" gracePeriod=30 Nov 24 09:19:53 crc kubenswrapper[4563]: I1124 09:19:53.414760 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" 
podStartSLOduration=1.848983941 podStartE2EDuration="5.414746794s" podCreationTimestamp="2025-11-24 09:19:48 +0000 UTC" firstStartedPulling="2025-11-24 09:19:49.284216014 +0000 UTC m=+966.543193462" lastFinishedPulling="2025-11-24 09:19:52.849978868 +0000 UTC m=+970.108956315" observedRunningTime="2025-11-24 09:19:53.411538392 +0000 UTC m=+970.670515839" watchObservedRunningTime="2025-11-24 09:19:53.414746794 +0000 UTC m=+970.673724231" Nov 24 09:19:53 crc kubenswrapper[4563]: I1124 09:19:53.574988 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 24 09:19:53 crc kubenswrapper[4563]: I1124 09:19:53.575158 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 24 09:19:53 crc kubenswrapper[4563]: I1124 09:19:53.610325 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 24 09:19:53 crc kubenswrapper[4563]: I1124 09:19:53.622818 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 24 09:19:54 crc kubenswrapper[4563]: I1124 09:19:54.412452 4563 generic.go:334] "Generic (PLEG): container finished" podID="f583c7a2-35a4-4338-b2c4-f069dd971290" containerID="73edaab2504f2cd89e195d68f32fe1912a83fbf60133f5d4ee50dc90b5a36b7a" exitCode=0 Nov 24 09:19:54 crc kubenswrapper[4563]: I1124 09:19:54.412793 4563 generic.go:334] "Generic (PLEG): container finished" podID="f583c7a2-35a4-4338-b2c4-f069dd971290" containerID="922516add2fdedb8af7768b7e3677fe01a4367146ea6219861478b86f27023cb" exitCode=2 Nov 24 09:19:54 crc kubenswrapper[4563]: I1124 09:19:54.412806 4563 generic.go:334] "Generic (PLEG): container finished" podID="f583c7a2-35a4-4338-b2c4-f069dd971290" containerID="dac9b5d69811900e3bfb1caf853a34390014ef24d9d606202438d294088696c3" exitCode=0 Nov 24 09:19:54 crc kubenswrapper[4563]: I1124 
09:19:54.412816 4563 generic.go:334] "Generic (PLEG): container finished" podID="f583c7a2-35a4-4338-b2c4-f069dd971290" containerID="4981168a34838ff03c867b84b3fd6379a315ed30af3944da659cb7cb34e064c3" exitCode=0 Nov 24 09:19:54 crc kubenswrapper[4563]: I1124 09:19:54.412528 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f583c7a2-35a4-4338-b2c4-f069dd971290","Type":"ContainerDied","Data":"73edaab2504f2cd89e195d68f32fe1912a83fbf60133f5d4ee50dc90b5a36b7a"} Nov 24 09:19:54 crc kubenswrapper[4563]: I1124 09:19:54.412894 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f583c7a2-35a4-4338-b2c4-f069dd971290","Type":"ContainerDied","Data":"922516add2fdedb8af7768b7e3677fe01a4367146ea6219861478b86f27023cb"} Nov 24 09:19:54 crc kubenswrapper[4563]: I1124 09:19:54.412907 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f583c7a2-35a4-4338-b2c4-f069dd971290","Type":"ContainerDied","Data":"dac9b5d69811900e3bfb1caf853a34390014ef24d9d606202438d294088696c3"} Nov 24 09:19:54 crc kubenswrapper[4563]: I1124 09:19:54.412926 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f583c7a2-35a4-4338-b2c4-f069dd971290","Type":"ContainerDied","Data":"4981168a34838ff03c867b84b3fd6379a315ed30af3944da659cb7cb34e064c3"} Nov 24 09:19:54 crc kubenswrapper[4563]: I1124 09:19:54.413143 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 24 09:19:54 crc kubenswrapper[4563]: I1124 09:19:54.413201 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 24 09:19:56 crc kubenswrapper[4563]: I1124 09:19:56.054507 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 24 09:19:56 crc kubenswrapper[4563]: I1124 09:19:56.059555 4563 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.236314 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.368936 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f583c7a2-35a4-4338-b2c4-f069dd971290-run-httpd\") pod \"f583c7a2-35a4-4338-b2c4-f069dd971290\" (UID: \"f583c7a2-35a4-4338-b2c4-f069dd971290\") " Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.369170 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gv8l\" (UniqueName: \"kubernetes.io/projected/f583c7a2-35a4-4338-b2c4-f069dd971290-kube-api-access-7gv8l\") pod \"f583c7a2-35a4-4338-b2c4-f069dd971290\" (UID: \"f583c7a2-35a4-4338-b2c4-f069dd971290\") " Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.369326 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f583c7a2-35a4-4338-b2c4-f069dd971290-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f583c7a2-35a4-4338-b2c4-f069dd971290" (UID: "f583c7a2-35a4-4338-b2c4-f069dd971290"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.369338 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f583c7a2-35a4-4338-b2c4-f069dd971290-scripts\") pod \"f583c7a2-35a4-4338-b2c4-f069dd971290\" (UID: \"f583c7a2-35a4-4338-b2c4-f069dd971290\") " Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.369712 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f583c7a2-35a4-4338-b2c4-f069dd971290-config-data\") pod \"f583c7a2-35a4-4338-b2c4-f069dd971290\" (UID: \"f583c7a2-35a4-4338-b2c4-f069dd971290\") " Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.369778 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f583c7a2-35a4-4338-b2c4-f069dd971290-combined-ca-bundle\") pod \"f583c7a2-35a4-4338-b2c4-f069dd971290\" (UID: \"f583c7a2-35a4-4338-b2c4-f069dd971290\") " Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.369805 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f583c7a2-35a4-4338-b2c4-f069dd971290-sg-core-conf-yaml\") pod \"f583c7a2-35a4-4338-b2c4-f069dd971290\" (UID: \"f583c7a2-35a4-4338-b2c4-f069dd971290\") " Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.369841 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f583c7a2-35a4-4338-b2c4-f069dd971290-log-httpd\") pod \"f583c7a2-35a4-4338-b2c4-f069dd971290\" (UID: \"f583c7a2-35a4-4338-b2c4-f069dd971290\") " Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.370208 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f583c7a2-35a4-4338-b2c4-f069dd971290-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f583c7a2-35a4-4338-b2c4-f069dd971290" (UID: "f583c7a2-35a4-4338-b2c4-f069dd971290"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.370978 4563 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f583c7a2-35a4-4338-b2c4-f069dd971290-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.371005 4563 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f583c7a2-35a4-4338-b2c4-f069dd971290-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.375137 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f583c7a2-35a4-4338-b2c4-f069dd971290-scripts" (OuterVolumeSpecName: "scripts") pod "f583c7a2-35a4-4338-b2c4-f069dd971290" (UID: "f583c7a2-35a4-4338-b2c4-f069dd971290"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.375312 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f583c7a2-35a4-4338-b2c4-f069dd971290-kube-api-access-7gv8l" (OuterVolumeSpecName: "kube-api-access-7gv8l") pod "f583c7a2-35a4-4338-b2c4-f069dd971290" (UID: "f583c7a2-35a4-4338-b2c4-f069dd971290"). InnerVolumeSpecName "kube-api-access-7gv8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.409938 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f583c7a2-35a4-4338-b2c4-f069dd971290-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f583c7a2-35a4-4338-b2c4-f069dd971290" (UID: "f583c7a2-35a4-4338-b2c4-f069dd971290"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.462009 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f583c7a2-35a4-4338-b2c4-f069dd971290-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f583c7a2-35a4-4338-b2c4-f069dd971290" (UID: "f583c7a2-35a4-4338-b2c4-f069dd971290"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.465819 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f583c7a2-35a4-4338-b2c4-f069dd971290-config-data" (OuterVolumeSpecName: "config-data") pod "f583c7a2-35a4-4338-b2c4-f069dd971290" (UID: "f583c7a2-35a4-4338-b2c4-f069dd971290"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.467044 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f583c7a2-35a4-4338-b2c4-f069dd971290","Type":"ContainerDied","Data":"f3651c47d5a58a046d36d14cd8322d8efac93a98fe9ddadcc8f3e5af772220d4"} Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.467099 4563 scope.go:117] "RemoveContainer" containerID="73edaab2504f2cd89e195d68f32fe1912a83fbf60133f5d4ee50dc90b5a36b7a" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.467271 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.474306 4563 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f583c7a2-35a4-4338-b2c4-f069dd971290-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.474336 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f583c7a2-35a4-4338-b2c4-f069dd971290-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.474350 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f583c7a2-35a4-4338-b2c4-f069dd971290-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.474362 4563 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f583c7a2-35a4-4338-b2c4-f069dd971290-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.474373 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gv8l\" (UniqueName: \"kubernetes.io/projected/f583c7a2-35a4-4338-b2c4-f069dd971290-kube-api-access-7gv8l\") on node \"crc\" DevicePath \"\"" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.475141 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rgwhs" event={"ID":"8c326972-9b5f-4f8c-b71d-6811d65b31e1","Type":"ContainerStarted","Data":"0be0346bf4becd1ff51770cad5e42c8beca3aae6fea47dd4defa408bcb9dad93"} Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.493378 4563 scope.go:117] "RemoveContainer" containerID="922516add2fdedb8af7768b7e3677fe01a4367146ea6219861478b86f27023cb" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.509917 4563 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/ceilometer-0"] Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.517234 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.518234 4563 scope.go:117] "RemoveContainer" containerID="dac9b5d69811900e3bfb1caf853a34390014ef24d9d606202438d294088696c3" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.524621 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-rgwhs" podStartSLOduration=1.634533513 podStartE2EDuration="8.524609417s" podCreationTimestamp="2025-11-24 09:19:50 +0000 UTC" firstStartedPulling="2025-11-24 09:19:51.254785607 +0000 UTC m=+968.513763054" lastFinishedPulling="2025-11-24 09:19:58.14486151 +0000 UTC m=+975.403838958" observedRunningTime="2025-11-24 09:19:58.522666763 +0000 UTC m=+975.781644210" watchObservedRunningTime="2025-11-24 09:19:58.524609417 +0000 UTC m=+975.783586864" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.539002 4563 scope.go:117] "RemoveContainer" containerID="4981168a34838ff03c867b84b3fd6379a315ed30af3944da659cb7cb34e064c3" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.552654 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:19:58 crc kubenswrapper[4563]: E1124 09:19:58.553133 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f583c7a2-35a4-4338-b2c4-f069dd971290" containerName="ceilometer-central-agent" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.553153 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="f583c7a2-35a4-4338-b2c4-f069dd971290" containerName="ceilometer-central-agent" Nov 24 09:19:58 crc kubenswrapper[4563]: E1124 09:19:58.553164 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f583c7a2-35a4-4338-b2c4-f069dd971290" containerName="ceilometer-notification-agent" Nov 24 09:19:58 crc kubenswrapper[4563]: 
I1124 09:19:58.553172 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="f583c7a2-35a4-4338-b2c4-f069dd971290" containerName="ceilometer-notification-agent" Nov 24 09:19:58 crc kubenswrapper[4563]: E1124 09:19:58.553181 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f583c7a2-35a4-4338-b2c4-f069dd971290" containerName="proxy-httpd" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.553187 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="f583c7a2-35a4-4338-b2c4-f069dd971290" containerName="proxy-httpd" Nov 24 09:19:58 crc kubenswrapper[4563]: E1124 09:19:58.553197 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f583c7a2-35a4-4338-b2c4-f069dd971290" containerName="sg-core" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.553203 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="f583c7a2-35a4-4338-b2c4-f069dd971290" containerName="sg-core" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.563076 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="f583c7a2-35a4-4338-b2c4-f069dd971290" containerName="ceilometer-central-agent" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.563128 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="f583c7a2-35a4-4338-b2c4-f069dd971290" containerName="proxy-httpd" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.563146 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="f583c7a2-35a4-4338-b2c4-f069dd971290" containerName="sg-core" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.563162 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="f583c7a2-35a4-4338-b2c4-f069dd971290" containerName="ceilometer-notification-agent" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.579670 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.608621 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.610522 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.646770 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.728113 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\") " pod="openstack/ceilometer-0" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.728196 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\") " pod="openstack/ceilometer-0" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.728214 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm8wx\" (UniqueName: \"kubernetes.io/projected/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-kube-api-access-bm8wx\") pod \"ceilometer-0\" (UID: \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\") " pod="openstack/ceilometer-0" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.728232 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-config-data\") pod \"ceilometer-0\" (UID: 
\"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\") " pod="openstack/ceilometer-0" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.728253 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-run-httpd\") pod \"ceilometer-0\" (UID: \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\") " pod="openstack/ceilometer-0" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.728319 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-scripts\") pod \"ceilometer-0\" (UID: \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\") " pod="openstack/ceilometer-0" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.728389 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-log-httpd\") pod \"ceilometer-0\" (UID: \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\") " pod="openstack/ceilometer-0" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.787049 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.787096 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.817143 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.820591 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.830788 4563 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-scripts\") pod \"ceilometer-0\" (UID: \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\") " pod="openstack/ceilometer-0" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.830865 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-log-httpd\") pod \"ceilometer-0\" (UID: \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\") " pod="openstack/ceilometer-0" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.830950 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\") " pod="openstack/ceilometer-0" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.831002 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\") " pod="openstack/ceilometer-0" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.831021 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm8wx\" (UniqueName: \"kubernetes.io/projected/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-kube-api-access-bm8wx\") pod \"ceilometer-0\" (UID: \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\") " pod="openstack/ceilometer-0" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.831042 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-config-data\") pod \"ceilometer-0\" (UID: 
\"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\") " pod="openstack/ceilometer-0" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.831061 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-run-httpd\") pod \"ceilometer-0\" (UID: \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\") " pod="openstack/ceilometer-0" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.831393 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-log-httpd\") pod \"ceilometer-0\" (UID: \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\") " pod="openstack/ceilometer-0" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.831904 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-run-httpd\") pod \"ceilometer-0\" (UID: \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\") " pod="openstack/ceilometer-0" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.836593 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-scripts\") pod \"ceilometer-0\" (UID: \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\") " pod="openstack/ceilometer-0" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.839454 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\") " pod="openstack/ceilometer-0" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.839835 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-config-data\") pod \"ceilometer-0\" (UID: \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\") " pod="openstack/ceilometer-0" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.847146 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\") " pod="openstack/ceilometer-0" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.851603 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm8wx\" (UniqueName: \"kubernetes.io/projected/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-kube-api-access-bm8wx\") pod \"ceilometer-0\" (UID: \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\") " pod="openstack/ceilometer-0" Nov 24 09:19:58 crc kubenswrapper[4563]: I1124 09:19:58.957260 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 09:19:59 crc kubenswrapper[4563]: I1124 09:19:59.100521 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f583c7a2-35a4-4338-b2c4-f069dd971290" path="/var/lib/kubelet/pods/f583c7a2-35a4-4338-b2c4-f069dd971290/volumes" Nov 24 09:19:59 crc kubenswrapper[4563]: I1124 09:19:59.396083 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:19:59 crc kubenswrapper[4563]: I1124 09:19:59.488859 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e42cf6e-bed3-4eff-9828-b2f14b7b7981","Type":"ContainerStarted","Data":"24662b7729911f2cfb85c83bc1bbd5013d7248bc7400a10d9fb3d8252e8770ba"} Nov 24 09:19:59 crc kubenswrapper[4563]: I1124 09:19:59.489098 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 24 09:19:59 crc kubenswrapper[4563]: I1124 09:19:59.489814 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 24 09:20:00 crc kubenswrapper[4563]: I1124 09:20:00.498713 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e42cf6e-bed3-4eff-9828-b2f14b7b7981","Type":"ContainerStarted","Data":"69648a918368a95cb15dba711aae4e2c4329826abc4b484e5bdb023125852279"} Nov 24 09:20:01 crc kubenswrapper[4563]: I1124 09:20:01.275660 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 24 09:20:01 crc kubenswrapper[4563]: I1124 09:20:01.278741 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 24 09:20:01 crc kubenswrapper[4563]: I1124 09:20:01.507057 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7e42cf6e-bed3-4eff-9828-b2f14b7b7981","Type":"ContainerStarted","Data":"fb197f497ffecc24dadc9ffebf1eb8ec8ef0661ea37cab2f856deb148ed7ae81"} Nov 24 09:20:03 crc kubenswrapper[4563]: I1124 09:20:03.529167 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e42cf6e-bed3-4eff-9828-b2f14b7b7981","Type":"ContainerStarted","Data":"f3edac3a5dae761c2597536e22d6bc7ab2f4caa873e6cc3cf08b00e0b05a91d0"} Nov 24 09:20:03 crc kubenswrapper[4563]: I1124 09:20:03.572359 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:20:04 crc kubenswrapper[4563]: I1124 09:20:04.543247 4563 generic.go:334] "Generic (PLEG): container finished" podID="8c326972-9b5f-4f8c-b71d-6811d65b31e1" containerID="0be0346bf4becd1ff51770cad5e42c8beca3aae6fea47dd4defa408bcb9dad93" exitCode=0 Nov 24 09:20:04 crc kubenswrapper[4563]: I1124 09:20:04.543399 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rgwhs" event={"ID":"8c326972-9b5f-4f8c-b71d-6811d65b31e1","Type":"ContainerDied","Data":"0be0346bf4becd1ff51770cad5e42c8beca3aae6fea47dd4defa408bcb9dad93"} Nov 24 09:20:05 crc kubenswrapper[4563]: I1124 09:20:05.558894 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e42cf6e-bed3-4eff-9828-b2f14b7b7981" containerName="ceilometer-central-agent" containerID="cri-o://69648a918368a95cb15dba711aae4e2c4329826abc4b484e5bdb023125852279" gracePeriod=30 Nov 24 09:20:05 crc kubenswrapper[4563]: I1124 09:20:05.559687 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e42cf6e-bed3-4eff-9828-b2f14b7b7981","Type":"ContainerStarted","Data":"794b4531d740ea83c583e0e2bb054f2575ef1925edebfed17f57fd9bc8925a6b"} Nov 24 09:20:05 crc kubenswrapper[4563]: I1124 09:20:05.559755 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 24 
09:20:05 crc kubenswrapper[4563]: I1124 09:20:05.560107 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e42cf6e-bed3-4eff-9828-b2f14b7b7981" containerName="proxy-httpd" containerID="cri-o://794b4531d740ea83c583e0e2bb054f2575ef1925edebfed17f57fd9bc8925a6b" gracePeriod=30 Nov 24 09:20:05 crc kubenswrapper[4563]: I1124 09:20:05.560169 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e42cf6e-bed3-4eff-9828-b2f14b7b7981" containerName="sg-core" containerID="cri-o://f3edac3a5dae761c2597536e22d6bc7ab2f4caa873e6cc3cf08b00e0b05a91d0" gracePeriod=30 Nov 24 09:20:05 crc kubenswrapper[4563]: I1124 09:20:05.560208 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e42cf6e-bed3-4eff-9828-b2f14b7b7981" containerName="ceilometer-notification-agent" containerID="cri-o://fb197f497ffecc24dadc9ffebf1eb8ec8ef0661ea37cab2f856deb148ed7ae81" gracePeriod=30 Nov 24 09:20:05 crc kubenswrapper[4563]: I1124 09:20:05.582960 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.251909012 podStartE2EDuration="7.582948396s" podCreationTimestamp="2025-11-24 09:19:58 +0000 UTC" firstStartedPulling="2025-11-24 09:19:59.395533303 +0000 UTC m=+976.654510760" lastFinishedPulling="2025-11-24 09:20:04.726572698 +0000 UTC m=+981.985550144" observedRunningTime="2025-11-24 09:20:05.57933457 +0000 UTC m=+982.838312016" watchObservedRunningTime="2025-11-24 09:20:05.582948396 +0000 UTC m=+982.841925843" Nov 24 09:20:05 crc kubenswrapper[4563]: I1124 09:20:05.865038 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rgwhs" Nov 24 09:20:05 crc kubenswrapper[4563]: I1124 09:20:05.887261 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c326972-9b5f-4f8c-b71d-6811d65b31e1-scripts\") pod \"8c326972-9b5f-4f8c-b71d-6811d65b31e1\" (UID: \"8c326972-9b5f-4f8c-b71d-6811d65b31e1\") " Nov 24 09:20:05 crc kubenswrapper[4563]: I1124 09:20:05.887410 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwglj\" (UniqueName: \"kubernetes.io/projected/8c326972-9b5f-4f8c-b71d-6811d65b31e1-kube-api-access-hwglj\") pod \"8c326972-9b5f-4f8c-b71d-6811d65b31e1\" (UID: \"8c326972-9b5f-4f8c-b71d-6811d65b31e1\") " Nov 24 09:20:05 crc kubenswrapper[4563]: I1124 09:20:05.887455 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c326972-9b5f-4f8c-b71d-6811d65b31e1-config-data\") pod \"8c326972-9b5f-4f8c-b71d-6811d65b31e1\" (UID: \"8c326972-9b5f-4f8c-b71d-6811d65b31e1\") " Nov 24 09:20:05 crc kubenswrapper[4563]: I1124 09:20:05.887583 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c326972-9b5f-4f8c-b71d-6811d65b31e1-combined-ca-bundle\") pod \"8c326972-9b5f-4f8c-b71d-6811d65b31e1\" (UID: \"8c326972-9b5f-4f8c-b71d-6811d65b31e1\") " Nov 24 09:20:05 crc kubenswrapper[4563]: I1124 09:20:05.895393 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c326972-9b5f-4f8c-b71d-6811d65b31e1-kube-api-access-hwglj" (OuterVolumeSpecName: "kube-api-access-hwglj") pod "8c326972-9b5f-4f8c-b71d-6811d65b31e1" (UID: "8c326972-9b5f-4f8c-b71d-6811d65b31e1"). InnerVolumeSpecName "kube-api-access-hwglj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:20:05 crc kubenswrapper[4563]: I1124 09:20:05.895400 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c326972-9b5f-4f8c-b71d-6811d65b31e1-scripts" (OuterVolumeSpecName: "scripts") pod "8c326972-9b5f-4f8c-b71d-6811d65b31e1" (UID: "8c326972-9b5f-4f8c-b71d-6811d65b31e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:05 crc kubenswrapper[4563]: I1124 09:20:05.920793 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c326972-9b5f-4f8c-b71d-6811d65b31e1-config-data" (OuterVolumeSpecName: "config-data") pod "8c326972-9b5f-4f8c-b71d-6811d65b31e1" (UID: "8c326972-9b5f-4f8c-b71d-6811d65b31e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:05 crc kubenswrapper[4563]: I1124 09:20:05.923937 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c326972-9b5f-4f8c-b71d-6811d65b31e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c326972-9b5f-4f8c-b71d-6811d65b31e1" (UID: "8c326972-9b5f-4f8c-b71d-6811d65b31e1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:05 crc kubenswrapper[4563]: I1124 09:20:05.989660 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwglj\" (UniqueName: \"kubernetes.io/projected/8c326972-9b5f-4f8c-b71d-6811d65b31e1-kube-api-access-hwglj\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:05 crc kubenswrapper[4563]: I1124 09:20:05.989687 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c326972-9b5f-4f8c-b71d-6811d65b31e1-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:05 crc kubenswrapper[4563]: I1124 09:20:05.989697 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c326972-9b5f-4f8c-b71d-6811d65b31e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:05 crc kubenswrapper[4563]: I1124 09:20:05.989708 4563 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c326972-9b5f-4f8c-b71d-6811d65b31e1-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:06 crc kubenswrapper[4563]: I1124 09:20:06.571821 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rgwhs" Nov 24 09:20:06 crc kubenswrapper[4563]: I1124 09:20:06.571809 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rgwhs" event={"ID":"8c326972-9b5f-4f8c-b71d-6811d65b31e1","Type":"ContainerDied","Data":"2967add47666fe873c918e515176bc69710e5be87eff8eca34480859284ba562"} Nov 24 09:20:06 crc kubenswrapper[4563]: I1124 09:20:06.572218 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2967add47666fe873c918e515176bc69710e5be87eff8eca34480859284ba562" Nov 24 09:20:06 crc kubenswrapper[4563]: I1124 09:20:06.575318 4563 generic.go:334] "Generic (PLEG): container finished" podID="7e42cf6e-bed3-4eff-9828-b2f14b7b7981" containerID="794b4531d740ea83c583e0e2bb054f2575ef1925edebfed17f57fd9bc8925a6b" exitCode=0 Nov 24 09:20:06 crc kubenswrapper[4563]: I1124 09:20:06.575344 4563 generic.go:334] "Generic (PLEG): container finished" podID="7e42cf6e-bed3-4eff-9828-b2f14b7b7981" containerID="f3edac3a5dae761c2597536e22d6bc7ab2f4caa873e6cc3cf08b00e0b05a91d0" exitCode=2 Nov 24 09:20:06 crc kubenswrapper[4563]: I1124 09:20:06.575352 4563 generic.go:334] "Generic (PLEG): container finished" podID="7e42cf6e-bed3-4eff-9828-b2f14b7b7981" containerID="fb197f497ffecc24dadc9ffebf1eb8ec8ef0661ea37cab2f856deb148ed7ae81" exitCode=0 Nov 24 09:20:06 crc kubenswrapper[4563]: I1124 09:20:06.575373 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e42cf6e-bed3-4eff-9828-b2f14b7b7981","Type":"ContainerDied","Data":"794b4531d740ea83c583e0e2bb054f2575ef1925edebfed17f57fd9bc8925a6b"} Nov 24 09:20:06 crc kubenswrapper[4563]: I1124 09:20:06.575396 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e42cf6e-bed3-4eff-9828-b2f14b7b7981","Type":"ContainerDied","Data":"f3edac3a5dae761c2597536e22d6bc7ab2f4caa873e6cc3cf08b00e0b05a91d0"} Nov 24 09:20:06 crc 
kubenswrapper[4563]: I1124 09:20:06.575407 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e42cf6e-bed3-4eff-9828-b2f14b7b7981","Type":"ContainerDied","Data":"fb197f497ffecc24dadc9ffebf1eb8ec8ef0661ea37cab2f856deb148ed7ae81"} Nov 24 09:20:06 crc kubenswrapper[4563]: I1124 09:20:06.648903 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 24 09:20:06 crc kubenswrapper[4563]: E1124 09:20:06.649354 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c326972-9b5f-4f8c-b71d-6811d65b31e1" containerName="nova-cell0-conductor-db-sync" Nov 24 09:20:06 crc kubenswrapper[4563]: I1124 09:20:06.649376 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c326972-9b5f-4f8c-b71d-6811d65b31e1" containerName="nova-cell0-conductor-db-sync" Nov 24 09:20:06 crc kubenswrapper[4563]: I1124 09:20:06.649633 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c326972-9b5f-4f8c-b71d-6811d65b31e1" containerName="nova-cell0-conductor-db-sync" Nov 24 09:20:06 crc kubenswrapper[4563]: I1124 09:20:06.650442 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 24 09:20:06 crc kubenswrapper[4563]: I1124 09:20:06.652372 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 24 09:20:06 crc kubenswrapper[4563]: I1124 09:20:06.653058 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-clr79" Nov 24 09:20:06 crc kubenswrapper[4563]: I1124 09:20:06.665873 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 24 09:20:06 crc kubenswrapper[4563]: I1124 09:20:06.805086 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7e5bec0-7f94-410f-9344-aaa699457924-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a7e5bec0-7f94-410f-9344-aaa699457924\") " pod="openstack/nova-cell0-conductor-0" Nov 24 09:20:06 crc kubenswrapper[4563]: I1124 09:20:06.805157 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlbzv\" (UniqueName: \"kubernetes.io/projected/a7e5bec0-7f94-410f-9344-aaa699457924-kube-api-access-tlbzv\") pod \"nova-cell0-conductor-0\" (UID: \"a7e5bec0-7f94-410f-9344-aaa699457924\") " pod="openstack/nova-cell0-conductor-0" Nov 24 09:20:06 crc kubenswrapper[4563]: I1124 09:20:06.805579 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7e5bec0-7f94-410f-9344-aaa699457924-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a7e5bec0-7f94-410f-9344-aaa699457924\") " pod="openstack/nova-cell0-conductor-0" Nov 24 09:20:06 crc kubenswrapper[4563]: I1124 09:20:06.907255 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlbzv\" (UniqueName: 
\"kubernetes.io/projected/a7e5bec0-7f94-410f-9344-aaa699457924-kube-api-access-tlbzv\") pod \"nova-cell0-conductor-0\" (UID: \"a7e5bec0-7f94-410f-9344-aaa699457924\") " pod="openstack/nova-cell0-conductor-0" Nov 24 09:20:06 crc kubenswrapper[4563]: I1124 09:20:06.907391 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7e5bec0-7f94-410f-9344-aaa699457924-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a7e5bec0-7f94-410f-9344-aaa699457924\") " pod="openstack/nova-cell0-conductor-0" Nov 24 09:20:06 crc kubenswrapper[4563]: I1124 09:20:06.907454 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7e5bec0-7f94-410f-9344-aaa699457924-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a7e5bec0-7f94-410f-9344-aaa699457924\") " pod="openstack/nova-cell0-conductor-0" Nov 24 09:20:06 crc kubenswrapper[4563]: I1124 09:20:06.913325 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7e5bec0-7f94-410f-9344-aaa699457924-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a7e5bec0-7f94-410f-9344-aaa699457924\") " pod="openstack/nova-cell0-conductor-0" Nov 24 09:20:06 crc kubenswrapper[4563]: I1124 09:20:06.914265 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7e5bec0-7f94-410f-9344-aaa699457924-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a7e5bec0-7f94-410f-9344-aaa699457924\") " pod="openstack/nova-cell0-conductor-0" Nov 24 09:20:06 crc kubenswrapper[4563]: I1124 09:20:06.923083 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlbzv\" (UniqueName: \"kubernetes.io/projected/a7e5bec0-7f94-410f-9344-aaa699457924-kube-api-access-tlbzv\") pod \"nova-cell0-conductor-0\" (UID: 
\"a7e5bec0-7f94-410f-9344-aaa699457924\") " pod="openstack/nova-cell0-conductor-0" Nov 24 09:20:06 crc kubenswrapper[4563]: I1124 09:20:06.962875 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 24 09:20:07 crc kubenswrapper[4563]: I1124 09:20:07.348134 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 24 09:20:07 crc kubenswrapper[4563]: W1124 09:20:07.355624 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7e5bec0_7f94_410f_9344_aaa699457924.slice/crio-dff3e0d056d857baaa04f990ea34e2e5d070762a3d93f47337729fa9bf1d77ee WatchSource:0}: Error finding container dff3e0d056d857baaa04f990ea34e2e5d070762a3d93f47337729fa9bf1d77ee: Status 404 returned error can't find the container with id dff3e0d056d857baaa04f990ea34e2e5d070762a3d93f47337729fa9bf1d77ee Nov 24 09:20:07 crc kubenswrapper[4563]: I1124 09:20:07.589832 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a7e5bec0-7f94-410f-9344-aaa699457924","Type":"ContainerStarted","Data":"af43f1e3cb3fb766ccc72509f66e5ca8b756b6faba18ddfb4a585a457ee8a32b"} Nov 24 09:20:07 crc kubenswrapper[4563]: I1124 09:20:07.590126 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Nov 24 09:20:07 crc kubenswrapper[4563]: I1124 09:20:07.590138 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a7e5bec0-7f94-410f-9344-aaa699457924","Type":"ContainerStarted","Data":"dff3e0d056d857baaa04f990ea34e2e5d070762a3d93f47337729fa9bf1d77ee"} Nov 24 09:20:07 crc kubenswrapper[4563]: I1124 09:20:07.602303 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.602285564 podStartE2EDuration="1.602285564s" 
podCreationTimestamp="2025-11-24 09:20:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:20:07.601600923 +0000 UTC m=+984.860578370" watchObservedRunningTime="2025-11-24 09:20:07.602285564 +0000 UTC m=+984.861263011" Nov 24 09:20:08 crc kubenswrapper[4563]: I1124 09:20:08.601485 4563 generic.go:334] "Generic (PLEG): container finished" podID="7e42cf6e-bed3-4eff-9828-b2f14b7b7981" containerID="69648a918368a95cb15dba711aae4e2c4329826abc4b484e5bdb023125852279" exitCode=0 Nov 24 09:20:08 crc kubenswrapper[4563]: I1124 09:20:08.601665 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e42cf6e-bed3-4eff-9828-b2f14b7b7981","Type":"ContainerDied","Data":"69648a918368a95cb15dba711aae4e2c4329826abc4b484e5bdb023125852279"} Nov 24 09:20:08 crc kubenswrapper[4563]: I1124 09:20:08.812810 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 09:20:08 crc kubenswrapper[4563]: I1124 09:20:08.850038 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-scripts\") pod \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\" (UID: \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\") " Nov 24 09:20:08 crc kubenswrapper[4563]: I1124 09:20:08.856130 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-scripts" (OuterVolumeSpecName: "scripts") pod "7e42cf6e-bed3-4eff-9828-b2f14b7b7981" (UID: "7e42cf6e-bed3-4eff-9828-b2f14b7b7981"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:08 crc kubenswrapper[4563]: I1124 09:20:08.951315 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-log-httpd\") pod \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\" (UID: \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\") " Nov 24 09:20:08 crc kubenswrapper[4563]: I1124 09:20:08.951360 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-config-data\") pod \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\" (UID: \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\") " Nov 24 09:20:08 crc kubenswrapper[4563]: I1124 09:20:08.951461 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-run-httpd\") pod \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\" (UID: \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\") " Nov 24 09:20:08 crc kubenswrapper[4563]: I1124 09:20:08.951543 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bm8wx\" (UniqueName: \"kubernetes.io/projected/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-kube-api-access-bm8wx\") pod \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\" (UID: \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\") " Nov 24 09:20:08 crc kubenswrapper[4563]: I1124 09:20:08.951607 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-combined-ca-bundle\") pod \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\" (UID: \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\") " Nov 24 09:20:08 crc kubenswrapper[4563]: I1124 09:20:08.951715 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-sg-core-conf-yaml\") pod \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\" (UID: \"7e42cf6e-bed3-4eff-9828-b2f14b7b7981\") " Nov 24 09:20:08 crc kubenswrapper[4563]: I1124 09:20:08.952164 4563 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:08 crc kubenswrapper[4563]: I1124 09:20:08.952188 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7e42cf6e-bed3-4eff-9828-b2f14b7b7981" (UID: "7e42cf6e-bed3-4eff-9828-b2f14b7b7981"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:20:08 crc kubenswrapper[4563]: I1124 09:20:08.952243 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7e42cf6e-bed3-4eff-9828-b2f14b7b7981" (UID: "7e42cf6e-bed3-4eff-9828-b2f14b7b7981"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:20:08 crc kubenswrapper[4563]: I1124 09:20:08.956830 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-kube-api-access-bm8wx" (OuterVolumeSpecName: "kube-api-access-bm8wx") pod "7e42cf6e-bed3-4eff-9828-b2f14b7b7981" (UID: "7e42cf6e-bed3-4eff-9828-b2f14b7b7981"). InnerVolumeSpecName "kube-api-access-bm8wx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:20:08 crc kubenswrapper[4563]: I1124 09:20:08.972842 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7e42cf6e-bed3-4eff-9828-b2f14b7b7981" (UID: "7e42cf6e-bed3-4eff-9828-b2f14b7b7981"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:08 crc kubenswrapper[4563]: I1124 09:20:08.987534 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:20:08 crc kubenswrapper[4563]: I1124 09:20:08.987597 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.012822 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e42cf6e-bed3-4eff-9828-b2f14b7b7981" (UID: "7e42cf6e-bed3-4eff-9828-b2f14b7b7981"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.029310 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-config-data" (OuterVolumeSpecName: "config-data") pod "7e42cf6e-bed3-4eff-9828-b2f14b7b7981" (UID: "7e42cf6e-bed3-4eff-9828-b2f14b7b7981"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.054630 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bm8wx\" (UniqueName: \"kubernetes.io/projected/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-kube-api-access-bm8wx\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.054687 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.054698 4563 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.054712 4563 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.054738 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.054747 4563 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7e42cf6e-bed3-4eff-9828-b2f14b7b7981-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.614035 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e42cf6e-bed3-4eff-9828-b2f14b7b7981","Type":"ContainerDied","Data":"24662b7729911f2cfb85c83bc1bbd5013d7248bc7400a10d9fb3d8252e8770ba"} Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.614108 4563 scope.go:117] "RemoveContainer" containerID="794b4531d740ea83c583e0e2bb054f2575ef1925edebfed17f57fd9bc8925a6b" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.614143 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.635676 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.639877 4563 scope.go:117] "RemoveContainer" containerID="f3edac3a5dae761c2597536e22d6bc7ab2f4caa873e6cc3cf08b00e0b05a91d0" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.644785 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.671088 4563 scope.go:117] "RemoveContainer" containerID="fb197f497ffecc24dadc9ffebf1eb8ec8ef0661ea37cab2f856deb148ed7ae81" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.680677 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:20:09 crc kubenswrapper[4563]: E1124 09:20:09.681164 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e42cf6e-bed3-4eff-9828-b2f14b7b7981" containerName="ceilometer-central-agent" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.681185 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e42cf6e-bed3-4eff-9828-b2f14b7b7981" containerName="ceilometer-central-agent" Nov 24 09:20:09 crc 
kubenswrapper[4563]: E1124 09:20:09.681230 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e42cf6e-bed3-4eff-9828-b2f14b7b7981" containerName="ceilometer-notification-agent" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.681238 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e42cf6e-bed3-4eff-9828-b2f14b7b7981" containerName="ceilometer-notification-agent" Nov 24 09:20:09 crc kubenswrapper[4563]: E1124 09:20:09.681256 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e42cf6e-bed3-4eff-9828-b2f14b7b7981" containerName="sg-core" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.681264 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e42cf6e-bed3-4eff-9828-b2f14b7b7981" containerName="sg-core" Nov 24 09:20:09 crc kubenswrapper[4563]: E1124 09:20:09.681273 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e42cf6e-bed3-4eff-9828-b2f14b7b7981" containerName="proxy-httpd" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.681281 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e42cf6e-bed3-4eff-9828-b2f14b7b7981" containerName="proxy-httpd" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.681505 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e42cf6e-bed3-4eff-9828-b2f14b7b7981" containerName="sg-core" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.681525 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e42cf6e-bed3-4eff-9828-b2f14b7b7981" containerName="ceilometer-central-agent" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.681543 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e42cf6e-bed3-4eff-9828-b2f14b7b7981" containerName="ceilometer-notification-agent" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.681555 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e42cf6e-bed3-4eff-9828-b2f14b7b7981" containerName="proxy-httpd" 
Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.683563 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.686572 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.686897 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.692093 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.692996 4563 scope.go:117] "RemoveContainer" containerID="69648a918368a95cb15dba711aae4e2c4329826abc4b484e5bdb023125852279" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.769181 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\") " pod="openstack/ceilometer-0" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.769320 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-run-httpd\") pod \"ceilometer-0\" (UID: \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\") " pod="openstack/ceilometer-0" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.769388 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-config-data\") pod \"ceilometer-0\" (UID: \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\") " pod="openstack/ceilometer-0" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.769417 
4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-log-httpd\") pod \"ceilometer-0\" (UID: \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\") " pod="openstack/ceilometer-0" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.769462 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d67wf\" (UniqueName: \"kubernetes.io/projected/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-kube-api-access-d67wf\") pod \"ceilometer-0\" (UID: \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\") " pod="openstack/ceilometer-0" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.769575 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\") " pod="openstack/ceilometer-0" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.769717 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-scripts\") pod \"ceilometer-0\" (UID: \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\") " pod="openstack/ceilometer-0" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.870913 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-scripts\") pod \"ceilometer-0\" (UID: \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\") " pod="openstack/ceilometer-0" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.870981 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\") " pod="openstack/ceilometer-0" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.871018 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-run-httpd\") pod \"ceilometer-0\" (UID: \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\") " pod="openstack/ceilometer-0" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.871044 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-config-data\") pod \"ceilometer-0\" (UID: \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\") " pod="openstack/ceilometer-0" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.871064 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-log-httpd\") pod \"ceilometer-0\" (UID: \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\") " pod="openstack/ceilometer-0" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.871092 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d67wf\" (UniqueName: \"kubernetes.io/projected/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-kube-api-access-d67wf\") pod \"ceilometer-0\" (UID: \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\") " pod="openstack/ceilometer-0" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.871141 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\") " pod="openstack/ceilometer-0" Nov 24 09:20:09 crc kubenswrapper[4563]: 
I1124 09:20:09.871914 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-log-httpd\") pod \"ceilometer-0\" (UID: \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\") " pod="openstack/ceilometer-0" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.871970 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-run-httpd\") pod \"ceilometer-0\" (UID: \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\") " pod="openstack/ceilometer-0" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.875170 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-scripts\") pod \"ceilometer-0\" (UID: \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\") " pod="openstack/ceilometer-0" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.877367 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-config-data\") pod \"ceilometer-0\" (UID: \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\") " pod="openstack/ceilometer-0" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.877751 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\") " pod="openstack/ceilometer-0" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.882241 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\") " 
pod="openstack/ceilometer-0" Nov 24 09:20:09 crc kubenswrapper[4563]: I1124 09:20:09.886256 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d67wf\" (UniqueName: \"kubernetes.io/projected/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-kube-api-access-d67wf\") pod \"ceilometer-0\" (UID: \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\") " pod="openstack/ceilometer-0" Nov 24 09:20:10 crc kubenswrapper[4563]: I1124 09:20:10.007245 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 09:20:10 crc kubenswrapper[4563]: I1124 09:20:10.414075 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:20:10 crc kubenswrapper[4563]: W1124 09:20:10.421010 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea1e73ba_dc1f_41fe_a8c4_91c8a5e2ea6b.slice/crio-1d133de685ad21e08c3435cbd0182b25eefe728d9286705dd31ebc5e0088fc34 WatchSource:0}: Error finding container 1d133de685ad21e08c3435cbd0182b25eefe728d9286705dd31ebc5e0088fc34: Status 404 returned error can't find the container with id 1d133de685ad21e08c3435cbd0182b25eefe728d9286705dd31ebc5e0088fc34 Nov 24 09:20:10 crc kubenswrapper[4563]: I1124 09:20:10.650170 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b","Type":"ContainerStarted","Data":"1d133de685ad21e08c3435cbd0182b25eefe728d9286705dd31ebc5e0088fc34"} Nov 24 09:20:11 crc kubenswrapper[4563]: I1124 09:20:11.067209 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e42cf6e-bed3-4eff-9828-b2f14b7b7981" path="/var/lib/kubelet/pods/7e42cf6e-bed3-4eff-9828-b2f14b7b7981/volumes" Nov 24 09:20:11 crc kubenswrapper[4563]: I1124 09:20:11.660665 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b","Type":"ContainerStarted","Data":"8a3f77239c4245694b9ea33de4825b5afd4bc159721329407c3278485cad7808"} Nov 24 09:20:12 crc kubenswrapper[4563]: I1124 09:20:12.671932 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b","Type":"ContainerStarted","Data":"6619eae2703b15e362e62ea50db8761c10b2ff961756f1432e95fd86594a5c87"} Nov 24 09:20:12 crc kubenswrapper[4563]: I1124 09:20:12.672399 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b","Type":"ContainerStarted","Data":"11abab118e1b19bf81567d54eca3d64b6079a6410d25f3727f652e05ddd2420a"} Nov 24 09:20:14 crc kubenswrapper[4563]: I1124 09:20:14.690608 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b","Type":"ContainerStarted","Data":"6cc9511aa58c5a49ef8fa255825bc26a1ae59d08f70be4112f8547340ced9d32"} Nov 24 09:20:14 crc kubenswrapper[4563]: I1124 09:20:14.691187 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 24 09:20:14 crc kubenswrapper[4563]: I1124 09:20:14.714982 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.394171508 podStartE2EDuration="5.714965279s" podCreationTimestamp="2025-11-24 09:20:09 +0000 UTC" firstStartedPulling="2025-11-24 09:20:10.424905077 +0000 UTC m=+987.683882515" lastFinishedPulling="2025-11-24 09:20:13.745698838 +0000 UTC m=+991.004676286" observedRunningTime="2025-11-24 09:20:14.707421512 +0000 UTC m=+991.966398959" watchObservedRunningTime="2025-11-24 09:20:14.714965279 +0000 UTC m=+991.973942716" Nov 24 09:20:16 crc kubenswrapper[4563]: I1124 09:20:16.987569 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 24 
09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.390200 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-fhd9d"] Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.391488 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fhd9d" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.393574 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.393949 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.399371 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-fhd9d"] Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.500486 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.502010 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.504733 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.520994 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.539338 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ad9a449-c587-419b-8f09-3fa89ed6a90b-config-data\") pod \"nova-cell0-cell-mapping-fhd9d\" (UID: \"0ad9a449-c587-419b-8f09-3fa89ed6a90b\") " pod="openstack/nova-cell0-cell-mapping-fhd9d" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.539375 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ad9a449-c587-419b-8f09-3fa89ed6a90b-scripts\") pod \"nova-cell0-cell-mapping-fhd9d\" (UID: \"0ad9a449-c587-419b-8f09-3fa89ed6a90b\") " pod="openstack/nova-cell0-cell-mapping-fhd9d" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.539484 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ad9a449-c587-419b-8f09-3fa89ed6a90b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-fhd9d\" (UID: \"0ad9a449-c587-419b-8f09-3fa89ed6a90b\") " pod="openstack/nova-cell0-cell-mapping-fhd9d" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.539565 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgwh6\" (UniqueName: \"kubernetes.io/projected/0ad9a449-c587-419b-8f09-3fa89ed6a90b-kube-api-access-wgwh6\") pod \"nova-cell0-cell-mapping-fhd9d\" (UID: \"0ad9a449-c587-419b-8f09-3fa89ed6a90b\") " pod="openstack/nova-cell0-cell-mapping-fhd9d" Nov 
24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.587758 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.591696 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.595252 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.631586 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.643138 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156cd303-99b2-4b2b-b149-1529e89a98ed-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"156cd303-99b2-4b2b-b149-1529e89a98ed\") " pod="openstack/nova-api-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.645266 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ad9a449-c587-419b-8f09-3fa89ed6a90b-config-data\") pod \"nova-cell0-cell-mapping-fhd9d\" (UID: \"0ad9a449-c587-419b-8f09-3fa89ed6a90b\") " pod="openstack/nova-cell0-cell-mapping-fhd9d" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.645322 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ad9a449-c587-419b-8f09-3fa89ed6a90b-scripts\") pod \"nova-cell0-cell-mapping-fhd9d\" (UID: \"0ad9a449-c587-419b-8f09-3fa89ed6a90b\") " pod="openstack/nova-cell0-cell-mapping-fhd9d" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.645620 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0ad9a449-c587-419b-8f09-3fa89ed6a90b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-fhd9d\" (UID: \"0ad9a449-c587-419b-8f09-3fa89ed6a90b\") " pod="openstack/nova-cell0-cell-mapping-fhd9d" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.645757 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/156cd303-99b2-4b2b-b149-1529e89a98ed-logs\") pod \"nova-api-0\" (UID: \"156cd303-99b2-4b2b-b149-1529e89a98ed\") " pod="openstack/nova-api-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.645854 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgwh6\" (UniqueName: \"kubernetes.io/projected/0ad9a449-c587-419b-8f09-3fa89ed6a90b-kube-api-access-wgwh6\") pod \"nova-cell0-cell-mapping-fhd9d\" (UID: \"0ad9a449-c587-419b-8f09-3fa89ed6a90b\") " pod="openstack/nova-cell0-cell-mapping-fhd9d" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.646045 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/156cd303-99b2-4b2b-b149-1529e89a98ed-config-data\") pod \"nova-api-0\" (UID: \"156cd303-99b2-4b2b-b149-1529e89a98ed\") " pod="openstack/nova-api-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.646567 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2f7j\" (UniqueName: \"kubernetes.io/projected/156cd303-99b2-4b2b-b149-1529e89a98ed-kube-api-access-f2f7j\") pod \"nova-api-0\" (UID: \"156cd303-99b2-4b2b-b149-1529e89a98ed\") " pod="openstack/nova-api-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.663160 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ad9a449-c587-419b-8f09-3fa89ed6a90b-combined-ca-bundle\") pod 
\"nova-cell0-cell-mapping-fhd9d\" (UID: \"0ad9a449-c587-419b-8f09-3fa89ed6a90b\") " pod="openstack/nova-cell0-cell-mapping-fhd9d" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.664541 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ad9a449-c587-419b-8f09-3fa89ed6a90b-scripts\") pod \"nova-cell0-cell-mapping-fhd9d\" (UID: \"0ad9a449-c587-419b-8f09-3fa89ed6a90b\") " pod="openstack/nova-cell0-cell-mapping-fhd9d" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.695413 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ad9a449-c587-419b-8f09-3fa89ed6a90b-config-data\") pod \"nova-cell0-cell-mapping-fhd9d\" (UID: \"0ad9a449-c587-419b-8f09-3fa89ed6a90b\") " pod="openstack/nova-cell0-cell-mapping-fhd9d" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.717039 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgwh6\" (UniqueName: \"kubernetes.io/projected/0ad9a449-c587-419b-8f09-3fa89ed6a90b-kube-api-access-wgwh6\") pod \"nova-cell0-cell-mapping-fhd9d\" (UID: \"0ad9a449-c587-419b-8f09-3fa89ed6a90b\") " pod="openstack/nova-cell0-cell-mapping-fhd9d" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.721529 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.723039 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.728033 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.736364 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.751939 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.760254 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2f7j\" (UniqueName: \"kubernetes.io/projected/156cd303-99b2-4b2b-b149-1529e89a98ed-kube-api-access-f2f7j\") pod \"nova-api-0\" (UID: \"156cd303-99b2-4b2b-b149-1529e89a98ed\") " pod="openstack/nova-api-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.760295 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/543b029c-5742-4c86-87c5-2c0f6dee9431-config-data\") pod \"nova-scheduler-0\" (UID: \"543b029c-5742-4c86-87c5-2c0f6dee9431\") " pod="openstack/nova-scheduler-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.760362 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vxlj\" (UniqueName: \"kubernetes.io/projected/543b029c-5742-4c86-87c5-2c0f6dee9431-kube-api-access-9vxlj\") pod \"nova-scheduler-0\" (UID: \"543b029c-5742-4c86-87c5-2c0f6dee9431\") " pod="openstack/nova-scheduler-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.760446 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156cd303-99b2-4b2b-b149-1529e89a98ed-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"156cd303-99b2-4b2b-b149-1529e89a98ed\") " pod="openstack/nova-api-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.760564 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/156cd303-99b2-4b2b-b149-1529e89a98ed-logs\") pod \"nova-api-0\" (UID: \"156cd303-99b2-4b2b-b149-1529e89a98ed\") " pod="openstack/nova-api-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.760700 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/156cd303-99b2-4b2b-b149-1529e89a98ed-config-data\") pod \"nova-api-0\" (UID: \"156cd303-99b2-4b2b-b149-1529e89a98ed\") " pod="openstack/nova-api-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.760759 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/543b029c-5742-4c86-87c5-2c0f6dee9431-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"543b029c-5742-4c86-87c5-2c0f6dee9431\") " pod="openstack/nova-scheduler-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.761156 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.761533 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/156cd303-99b2-4b2b-b149-1529e89a98ed-logs\") pod \"nova-api-0\" (UID: \"156cd303-99b2-4b2b-b149-1529e89a98ed\") " pod="openstack/nova-api-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.771729 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.772321 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156cd303-99b2-4b2b-b149-1529e89a98ed-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"156cd303-99b2-4b2b-b149-1529e89a98ed\") " pod="openstack/nova-api-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.775804 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.785162 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/156cd303-99b2-4b2b-b149-1529e89a98ed-config-data\") pod \"nova-api-0\" (UID: \"156cd303-99b2-4b2b-b149-1529e89a98ed\") " pod="openstack/nova-api-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.791085 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2f7j\" (UniqueName: \"kubernetes.io/projected/156cd303-99b2-4b2b-b149-1529e89a98ed-kube-api-access-f2f7j\") pod \"nova-api-0\" (UID: \"156cd303-99b2-4b2b-b149-1529e89a98ed\") " pod="openstack/nova-api-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.802464 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5dd7c4987f-f758b"] Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.804201 4563 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.826732 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.860648 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dd7c4987f-f758b"] Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.863043 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/543b029c-5742-4c86-87c5-2c0f6dee9431-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"543b029c-5742-4c86-87c5-2c0f6dee9431\") " pod="openstack/nova-scheduler-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.863097 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3d8046c-ee32-4f8a-acbf-c311fa4d83c3-config-data\") pod \"nova-metadata-0\" (UID: \"a3d8046c-ee32-4f8a-acbf-c311fa4d83c3\") " pod="openstack/nova-metadata-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.863138 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/543b029c-5742-4c86-87c5-2c0f6dee9431-config-data\") pod \"nova-scheduler-0\" (UID: \"543b029c-5742-4c86-87c5-2c0f6dee9431\") " pod="openstack/nova-scheduler-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.863158 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f84ace12-fa17-4fc7-8bf0-771e8273eb55-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f84ace12-fa17-4fc7-8bf0-771e8273eb55\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.863201 4563 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f84ace12-fa17-4fc7-8bf0-771e8273eb55-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f84ace12-fa17-4fc7-8bf0-771e8273eb55\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.863224 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vxlj\" (UniqueName: \"kubernetes.io/projected/543b029c-5742-4c86-87c5-2c0f6dee9431-kube-api-access-9vxlj\") pod \"nova-scheduler-0\" (UID: \"543b029c-5742-4c86-87c5-2c0f6dee9431\") " pod="openstack/nova-scheduler-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.863298 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r8xg\" (UniqueName: \"kubernetes.io/projected/f84ace12-fa17-4fc7-8bf0-771e8273eb55-kube-api-access-5r8xg\") pod \"nova-cell1-novncproxy-0\" (UID: \"f84ace12-fa17-4fc7-8bf0-771e8273eb55\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.863323 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3d8046c-ee32-4f8a-acbf-c311fa4d83c3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a3d8046c-ee32-4f8a-acbf-c311fa4d83c3\") " pod="openstack/nova-metadata-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.863362 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz9cg\" (UniqueName: \"kubernetes.io/projected/a3d8046c-ee32-4f8a-acbf-c311fa4d83c3-kube-api-access-zz9cg\") pod \"nova-metadata-0\" (UID: \"a3d8046c-ee32-4f8a-acbf-c311fa4d83c3\") " pod="openstack/nova-metadata-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.863407 4563 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3d8046c-ee32-4f8a-acbf-c311fa4d83c3-logs\") pod \"nova-metadata-0\" (UID: \"a3d8046c-ee32-4f8a-acbf-c311fa4d83c3\") " pod="openstack/nova-metadata-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.866398 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/543b029c-5742-4c86-87c5-2c0f6dee9431-config-data\") pod \"nova-scheduler-0\" (UID: \"543b029c-5742-4c86-87c5-2c0f6dee9431\") " pod="openstack/nova-scheduler-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.868028 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/543b029c-5742-4c86-87c5-2c0f6dee9431-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"543b029c-5742-4c86-87c5-2c0f6dee9431\") " pod="openstack/nova-scheduler-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.881524 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vxlj\" (UniqueName: \"kubernetes.io/projected/543b029c-5742-4c86-87c5-2c0f6dee9431-kube-api-access-9vxlj\") pod \"nova-scheduler-0\" (UID: \"543b029c-5742-4c86-87c5-2c0f6dee9431\") " pod="openstack/nova-scheduler-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.957312 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.965797 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz9cg\" (UniqueName: \"kubernetes.io/projected/a3d8046c-ee32-4f8a-acbf-c311fa4d83c3-kube-api-access-zz9cg\") pod \"nova-metadata-0\" (UID: \"a3d8046c-ee32-4f8a-acbf-c311fa4d83c3\") " pod="openstack/nova-metadata-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.965854 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3d8046c-ee32-4f8a-acbf-c311fa4d83c3-logs\") pod \"nova-metadata-0\" (UID: \"a3d8046c-ee32-4f8a-acbf-c311fa4d83c3\") " pod="openstack/nova-metadata-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.965906 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-dns-svc\") pod \"dnsmasq-dns-5dd7c4987f-f758b\" (UID: \"775efdde-5acb-4276-ab9d-bd4644541c9b\") " pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.965926 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-ovsdbserver-nb\") pod \"dnsmasq-dns-5dd7c4987f-f758b\" (UID: \"775efdde-5acb-4276-ab9d-bd4644541c9b\") " pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.965961 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6l7b\" (UniqueName: \"kubernetes.io/projected/775efdde-5acb-4276-ab9d-bd4644541c9b-kube-api-access-z6l7b\") pod \"dnsmasq-dns-5dd7c4987f-f758b\" (UID: \"775efdde-5acb-4276-ab9d-bd4644541c9b\") " 
pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.966000 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-dns-swift-storage-0\") pod \"dnsmasq-dns-5dd7c4987f-f758b\" (UID: \"775efdde-5acb-4276-ab9d-bd4644541c9b\") " pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.966013 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-ovsdbserver-sb\") pod \"dnsmasq-dns-5dd7c4987f-f758b\" (UID: \"775efdde-5acb-4276-ab9d-bd4644541c9b\") " pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.966035 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3d8046c-ee32-4f8a-acbf-c311fa4d83c3-config-data\") pod \"nova-metadata-0\" (UID: \"a3d8046c-ee32-4f8a-acbf-c311fa4d83c3\") " pod="openstack/nova-metadata-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.966056 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-config\") pod \"dnsmasq-dns-5dd7c4987f-f758b\" (UID: \"775efdde-5acb-4276-ab9d-bd4644541c9b\") " pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.966073 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f84ace12-fa17-4fc7-8bf0-771e8273eb55-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f84ace12-fa17-4fc7-8bf0-771e8273eb55\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 
09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.966098 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f84ace12-fa17-4fc7-8bf0-771e8273eb55-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f84ace12-fa17-4fc7-8bf0-771e8273eb55\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.966131 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r8xg\" (UniqueName: \"kubernetes.io/projected/f84ace12-fa17-4fc7-8bf0-771e8273eb55-kube-api-access-5r8xg\") pod \"nova-cell1-novncproxy-0\" (UID: \"f84ace12-fa17-4fc7-8bf0-771e8273eb55\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.966147 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3d8046c-ee32-4f8a-acbf-c311fa4d83c3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a3d8046c-ee32-4f8a-acbf-c311fa4d83c3\") " pod="openstack/nova-metadata-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.967190 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3d8046c-ee32-4f8a-acbf-c311fa4d83c3-logs\") pod \"nova-metadata-0\" (UID: \"a3d8046c-ee32-4f8a-acbf-c311fa4d83c3\") " pod="openstack/nova-metadata-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.970272 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3d8046c-ee32-4f8a-acbf-c311fa4d83c3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a3d8046c-ee32-4f8a-acbf-c311fa4d83c3\") " pod="openstack/nova-metadata-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.972374 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/f84ace12-fa17-4fc7-8bf0-771e8273eb55-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f84ace12-fa17-4fc7-8bf0-771e8273eb55\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.972954 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3d8046c-ee32-4f8a-acbf-c311fa4d83c3-config-data\") pod \"nova-metadata-0\" (UID: \"a3d8046c-ee32-4f8a-acbf-c311fa4d83c3\") " pod="openstack/nova-metadata-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.977080 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f84ace12-fa17-4fc7-8bf0-771e8273eb55-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f84ace12-fa17-4fc7-8bf0-771e8273eb55\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.988127 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz9cg\" (UniqueName: \"kubernetes.io/projected/a3d8046c-ee32-4f8a-acbf-c311fa4d83c3-kube-api-access-zz9cg\") pod \"nova-metadata-0\" (UID: \"a3d8046c-ee32-4f8a-acbf-c311fa4d83c3\") " pod="openstack/nova-metadata-0" Nov 24 09:20:17 crc kubenswrapper[4563]: I1124 09:20:17.990083 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r8xg\" (UniqueName: \"kubernetes.io/projected/f84ace12-fa17-4fc7-8bf0-771e8273eb55-kube-api-access-5r8xg\") pod \"nova-cell1-novncproxy-0\" (UID: \"f84ace12-fa17-4fc7-8bf0-771e8273eb55\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.011363 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fhd9d" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.068036 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-dns-svc\") pod \"dnsmasq-dns-5dd7c4987f-f758b\" (UID: \"775efdde-5acb-4276-ab9d-bd4644541c9b\") " pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.068082 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-ovsdbserver-nb\") pod \"dnsmasq-dns-5dd7c4987f-f758b\" (UID: \"775efdde-5acb-4276-ab9d-bd4644541c9b\") " pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.068121 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6l7b\" (UniqueName: \"kubernetes.io/projected/775efdde-5acb-4276-ab9d-bd4644541c9b-kube-api-access-z6l7b\") pod \"dnsmasq-dns-5dd7c4987f-f758b\" (UID: \"775efdde-5acb-4276-ab9d-bd4644541c9b\") " pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.068159 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-dns-swift-storage-0\") pod \"dnsmasq-dns-5dd7c4987f-f758b\" (UID: \"775efdde-5acb-4276-ab9d-bd4644541c9b\") " pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.068176 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-ovsdbserver-sb\") pod \"dnsmasq-dns-5dd7c4987f-f758b\" (UID: \"775efdde-5acb-4276-ab9d-bd4644541c9b\") " 
pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.068200 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-config\") pod \"dnsmasq-dns-5dd7c4987f-f758b\" (UID: \"775efdde-5acb-4276-ab9d-bd4644541c9b\") " pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.069220 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-config\") pod \"dnsmasq-dns-5dd7c4987f-f758b\" (UID: \"775efdde-5acb-4276-ab9d-bd4644541c9b\") " pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.070871 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-dns-svc\") pod \"dnsmasq-dns-5dd7c4987f-f758b\" (UID: \"775efdde-5acb-4276-ab9d-bd4644541c9b\") " pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.071074 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-ovsdbserver-sb\") pod \"dnsmasq-dns-5dd7c4987f-f758b\" (UID: \"775efdde-5acb-4276-ab9d-bd4644541c9b\") " pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.071128 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-ovsdbserver-nb\") pod \"dnsmasq-dns-5dd7c4987f-f758b\" (UID: \"775efdde-5acb-4276-ab9d-bd4644541c9b\") " pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.072897 4563 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-dns-swift-storage-0\") pod \"dnsmasq-dns-5dd7c4987f-f758b\" (UID: \"775efdde-5acb-4276-ab9d-bd4644541c9b\") " pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.087595 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6l7b\" (UniqueName: \"kubernetes.io/projected/775efdde-5acb-4276-ab9d-bd4644541c9b-kube-api-access-z6l7b\") pod \"dnsmasq-dns-5dd7c4987f-f758b\" (UID: \"775efdde-5acb-4276-ab9d-bd4644541c9b\") " pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.138234 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.144002 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.157132 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.341651 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.359155 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 24 09:20:18 crc kubenswrapper[4563]: W1124 09:20:18.362618 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod543b029c_5742_4c86_87c5_2c0f6dee9431.slice/crio-7ead6355a31689e297d874dc6d568c63c15decd9e5bef9bce8c25974cb0be577 WatchSource:0}: Error finding container 7ead6355a31689e297d874dc6d568c63c15decd9e5bef9bce8c25974cb0be577: Status 404 returned error can't find the container with id 7ead6355a31689e297d874dc6d568c63c15decd9e5bef9bce8c25974cb0be577 Nov 24 09:20:18 crc kubenswrapper[4563]: W1124 09:20:18.391353 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod156cd303_99b2_4b2b_b149_1529e89a98ed.slice/crio-61ff65de1298b1169980a5350f69d4c503735f0e34a2a510d1d2dd8e76a9bf3b WatchSource:0}: Error finding container 61ff65de1298b1169980a5350f69d4c503735f0e34a2a510d1d2dd8e76a9bf3b: Status 404 returned error can't find the container with id 61ff65de1298b1169980a5350f69d4c503735f0e34a2a510d1d2dd8e76a9bf3b Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.508430 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-c6q5k"] Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.509886 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-c6q5k" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.517664 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.517927 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.523273 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-c6q5k"] Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.664913 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-fhd9d"] Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.685417 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355-config-data\") pod \"nova-cell1-conductor-db-sync-c6q5k\" (UID: \"fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355\") " pod="openstack/nova-cell1-conductor-db-sync-c6q5k" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.685886 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bjzp\" (UniqueName: \"kubernetes.io/projected/fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355-kube-api-access-8bjzp\") pod \"nova-cell1-conductor-db-sync-c6q5k\" (UID: \"fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355\") " pod="openstack/nova-cell1-conductor-db-sync-c6q5k" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.685960 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355-scripts\") pod \"nova-cell1-conductor-db-sync-c6q5k\" (UID: \"fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355\") " 
pod="openstack/nova-cell1-conductor-db-sync-c6q5k" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.685977 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-c6q5k\" (UID: \"fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355\") " pod="openstack/nova-cell1-conductor-db-sync-c6q5k" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.739389 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fhd9d" event={"ID":"0ad9a449-c587-419b-8f09-3fa89ed6a90b","Type":"ContainerStarted","Data":"290093a30ed6218bcfbd0b8105cdc22e0332d269be13551c824bc844b9328bc4"} Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.741308 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"156cd303-99b2-4b2b-b149-1529e89a98ed","Type":"ContainerStarted","Data":"61ff65de1298b1169980a5350f69d4c503735f0e34a2a510d1d2dd8e76a9bf3b"} Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.742375 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"543b029c-5742-4c86-87c5-2c0f6dee9431","Type":"ContainerStarted","Data":"7ead6355a31689e297d874dc6d568c63c15decd9e5bef9bce8c25974cb0be577"} Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.786874 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bjzp\" (UniqueName: \"kubernetes.io/projected/fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355-kube-api-access-8bjzp\") pod \"nova-cell1-conductor-db-sync-c6q5k\" (UID: \"fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355\") " pod="openstack/nova-cell1-conductor-db-sync-c6q5k" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.786965 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355-scripts\") pod \"nova-cell1-conductor-db-sync-c6q5k\" (UID: \"fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355\") " pod="openstack/nova-cell1-conductor-db-sync-c6q5k" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.789692 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-c6q5k\" (UID: \"fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355\") " pod="openstack/nova-cell1-conductor-db-sync-c6q5k" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.789769 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355-config-data\") pod \"nova-cell1-conductor-db-sync-c6q5k\" (UID: \"fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355\") " pod="openstack/nova-cell1-conductor-db-sync-c6q5k" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.798159 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355-scripts\") pod \"nova-cell1-conductor-db-sync-c6q5k\" (UID: \"fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355\") " pod="openstack/nova-cell1-conductor-db-sync-c6q5k" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.801862 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-c6q5k\" (UID: \"fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355\") " pod="openstack/nova-cell1-conductor-db-sync-c6q5k" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.806487 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 
09:20:18.806780 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bjzp\" (UniqueName: \"kubernetes.io/projected/fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355-kube-api-access-8bjzp\") pod \"nova-cell1-conductor-db-sync-c6q5k\" (UID: \"fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355\") " pod="openstack/nova-cell1-conductor-db-sync-c6q5k" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.823203 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dd7c4987f-f758b"] Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.826003 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355-config-data\") pod \"nova-cell1-conductor-db-sync-c6q5k\" (UID: \"fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355\") " pod="openstack/nova-cell1-conductor-db-sync-c6q5k" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.838238 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-c6q5k" Nov 24 09:20:18 crc kubenswrapper[4563]: I1124 09:20:18.911781 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 09:20:19 crc kubenswrapper[4563]: I1124 09:20:19.305625 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-c6q5k"] Nov 24 09:20:19 crc kubenswrapper[4563]: W1124 09:20:19.313525 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbebc6ce_2c30_4daa_8e0f_87f6ef9a5355.slice/crio-08825eafe4e4b38d264d11bdfd952827dfbd19d015aa3366066907df117cf8a0 WatchSource:0}: Error finding container 08825eafe4e4b38d264d11bdfd952827dfbd19d015aa3366066907df117cf8a0: Status 404 returned error can't find the container with id 08825eafe4e4b38d264d11bdfd952827dfbd19d015aa3366066907df117cf8a0 Nov 24 09:20:19 crc kubenswrapper[4563]: I1124 09:20:19.751464 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f84ace12-fa17-4fc7-8bf0-771e8273eb55","Type":"ContainerStarted","Data":"3892549bbf939ad51b847b69ba38192a6655def6832c2b14ebfd24b405d929c7"} Nov 24 09:20:19 crc kubenswrapper[4563]: I1124 09:20:19.755034 4563 generic.go:334] "Generic (PLEG): container finished" podID="775efdde-5acb-4276-ab9d-bd4644541c9b" containerID="77403049f76b0bab1ca45388ca29a86c008855ecaac23a298ad8812919854023" exitCode=0 Nov 24 09:20:19 crc kubenswrapper[4563]: I1124 09:20:19.755083 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" event={"ID":"775efdde-5acb-4276-ab9d-bd4644541c9b","Type":"ContainerDied","Data":"77403049f76b0bab1ca45388ca29a86c008855ecaac23a298ad8812919854023"} Nov 24 09:20:19 crc kubenswrapper[4563]: I1124 09:20:19.755103 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" 
event={"ID":"775efdde-5acb-4276-ab9d-bd4644541c9b","Type":"ContainerStarted","Data":"f2676ef731492b17eec6d39c5d8f59a39de58d23a5fead70323b3088b500d00b"} Nov 24 09:20:19 crc kubenswrapper[4563]: I1124 09:20:19.764431 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a3d8046c-ee32-4f8a-acbf-c311fa4d83c3","Type":"ContainerStarted","Data":"f71b24d04de89d380479ac0023c5ee17f13349b90700af6f507f44496e87a9cf"} Nov 24 09:20:19 crc kubenswrapper[4563]: I1124 09:20:19.779018 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-c6q5k" event={"ID":"fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355","Type":"ContainerStarted","Data":"6d437b26233b7501cdb08b51f1fac6eba877d7312f963513539d4a7684268d44"} Nov 24 09:20:19 crc kubenswrapper[4563]: I1124 09:20:19.779066 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-c6q5k" event={"ID":"fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355","Type":"ContainerStarted","Data":"08825eafe4e4b38d264d11bdfd952827dfbd19d015aa3366066907df117cf8a0"} Nov 24 09:20:19 crc kubenswrapper[4563]: I1124 09:20:19.782613 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fhd9d" event={"ID":"0ad9a449-c587-419b-8f09-3fa89ed6a90b","Type":"ContainerStarted","Data":"5a9ce07a645a3ee4fed06c306271eb75773a5e5411acca9077abbe25ed62eb05"} Nov 24 09:20:19 crc kubenswrapper[4563]: I1124 09:20:19.836036 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-c6q5k" podStartSLOduration=1.836011595 podStartE2EDuration="1.836011595s" podCreationTimestamp="2025-11-24 09:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:20:19.800347213 +0000 UTC m=+997.059324660" watchObservedRunningTime="2025-11-24 09:20:19.836011595 +0000 UTC m=+997.094989173" Nov 24 
09:20:19 crc kubenswrapper[4563]: I1124 09:20:19.844113 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-fhd9d" podStartSLOduration=2.844097505 podStartE2EDuration="2.844097505s" podCreationTimestamp="2025-11-24 09:20:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:20:19.825909474 +0000 UTC m=+997.084886910" watchObservedRunningTime="2025-11-24 09:20:19.844097505 +0000 UTC m=+997.103074953" Nov 24 09:20:21 crc kubenswrapper[4563]: I1124 09:20:21.234053 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 09:20:21 crc kubenswrapper[4563]: I1124 09:20:21.304830 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 24 09:20:21 crc kubenswrapper[4563]: I1124 09:20:21.800513 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f84ace12-fa17-4fc7-8bf0-771e8273eb55","Type":"ContainerStarted","Data":"daf4829fa7ff43fc586194d9c583f00284566907b2991030a438f4e5c5c3ebaf"} Nov 24 09:20:21 crc kubenswrapper[4563]: I1124 09:20:21.800699 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="f84ace12-fa17-4fc7-8bf0-771e8273eb55" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://daf4829fa7ff43fc586194d9c583f00284566907b2991030a438f4e5c5c3ebaf" gracePeriod=30 Nov 24 09:20:21 crc kubenswrapper[4563]: I1124 09:20:21.802810 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"156cd303-99b2-4b2b-b149-1529e89a98ed","Type":"ContainerStarted","Data":"96237b7b48247229d5b773a81bc65af32178722ae26cb0431156e33c7eaebb09"} Nov 24 09:20:21 crc kubenswrapper[4563]: I1124 09:20:21.802834 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"156cd303-99b2-4b2b-b149-1529e89a98ed","Type":"ContainerStarted","Data":"3bbe161e57a24dce5944830853213c9ee9c5a7ec434a36e7ca752161bec6a1e4"} Nov 24 09:20:21 crc kubenswrapper[4563]: I1124 09:20:21.805890 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" event={"ID":"775efdde-5acb-4276-ab9d-bd4644541c9b","Type":"ContainerStarted","Data":"fed3a1794add98608d53791e138de31c21dff80d920b1a07e1a6b7624c1047b9"} Nov 24 09:20:21 crc kubenswrapper[4563]: I1124 09:20:21.805984 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" Nov 24 09:20:21 crc kubenswrapper[4563]: I1124 09:20:21.808390 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a3d8046c-ee32-4f8a-acbf-c311fa4d83c3" containerName="nova-metadata-log" containerID="cri-o://972ed429e1a8e3ee6d09b9347573ebecd2530edf07254b5c3a808cc6bc3d2660" gracePeriod=30 Nov 24 09:20:21 crc kubenswrapper[4563]: I1124 09:20:21.808382 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a3d8046c-ee32-4f8a-acbf-c311fa4d83c3","Type":"ContainerStarted","Data":"f16cbcfdd4d2f6b643a3c0bf9dfb1ca184dbb7b80e54169bed4d80b8aa36e0ab"} Nov 24 09:20:21 crc kubenswrapper[4563]: I1124 09:20:21.808425 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a3d8046c-ee32-4f8a-acbf-c311fa4d83c3" containerName="nova-metadata-metadata" containerID="cri-o://f16cbcfdd4d2f6b643a3c0bf9dfb1ca184dbb7b80e54169bed4d80b8aa36e0ab" gracePeriod=30 Nov 24 09:20:21 crc kubenswrapper[4563]: I1124 09:20:21.808526 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a3d8046c-ee32-4f8a-acbf-c311fa4d83c3","Type":"ContainerStarted","Data":"972ed429e1a8e3ee6d09b9347573ebecd2530edf07254b5c3a808cc6bc3d2660"} Nov 24 09:20:21 crc kubenswrapper[4563]: 
I1124 09:20:21.811392 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"543b029c-5742-4c86-87c5-2c0f6dee9431","Type":"ContainerStarted","Data":"52d586be505a6d6205c69f838d8b9a0f5b9a32a4e239ff61aa62d9482e6447b8"} Nov 24 09:20:21 crc kubenswrapper[4563]: I1124 09:20:21.825029 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.67795 podStartE2EDuration="4.825012079s" podCreationTimestamp="2025-11-24 09:20:17 +0000 UTC" firstStartedPulling="2025-11-24 09:20:18.82436159 +0000 UTC m=+996.083339036" lastFinishedPulling="2025-11-24 09:20:20.971423667 +0000 UTC m=+998.230401115" observedRunningTime="2025-11-24 09:20:21.816801594 +0000 UTC m=+999.075779041" watchObservedRunningTime="2025-11-24 09:20:21.825012079 +0000 UTC m=+999.083989527" Nov 24 09:20:21 crc kubenswrapper[4563]: I1124 09:20:21.834573 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.242397697 podStartE2EDuration="4.83456182s" podCreationTimestamp="2025-11-24 09:20:17 +0000 UTC" firstStartedPulling="2025-11-24 09:20:18.365610103 +0000 UTC m=+995.624587549" lastFinishedPulling="2025-11-24 09:20:20.957774225 +0000 UTC m=+998.216751672" observedRunningTime="2025-11-24 09:20:21.830136053 +0000 UTC m=+999.089113500" watchObservedRunningTime="2025-11-24 09:20:21.83456182 +0000 UTC m=+999.093539267" Nov 24 09:20:21 crc kubenswrapper[4563]: I1124 09:20:21.847964 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.749213866 podStartE2EDuration="4.847950221s" podCreationTimestamp="2025-11-24 09:20:17 +0000 UTC" firstStartedPulling="2025-11-24 09:20:18.927882621 +0000 UTC m=+996.186860068" lastFinishedPulling="2025-11-24 09:20:21.026618977 +0000 UTC m=+998.285596423" observedRunningTime="2025-11-24 09:20:21.845652266 +0000 UTC 
m=+999.104629713" watchObservedRunningTime="2025-11-24 09:20:21.847950221 +0000 UTC m=+999.106927669" Nov 24 09:20:21 crc kubenswrapper[4563]: I1124 09:20:21.871151 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" podStartSLOduration=4.871129818 podStartE2EDuration="4.871129818s" podCreationTimestamp="2025-11-24 09:20:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:20:21.86202726 +0000 UTC m=+999.121004707" watchObservedRunningTime="2025-11-24 09:20:21.871129818 +0000 UTC m=+999.130107265" Nov 24 09:20:21 crc kubenswrapper[4563]: I1124 09:20:21.887193 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.305897908 podStartE2EDuration="4.887175209s" podCreationTimestamp="2025-11-24 09:20:17 +0000 UTC" firstStartedPulling="2025-11-24 09:20:18.393140213 +0000 UTC m=+995.652117660" lastFinishedPulling="2025-11-24 09:20:20.974417514 +0000 UTC m=+998.233394961" observedRunningTime="2025-11-24 09:20:21.882250752 +0000 UTC m=+999.141228199" watchObservedRunningTime="2025-11-24 09:20:21.887175209 +0000 UTC m=+999.146152656" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.455702 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.585760 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3d8046c-ee32-4f8a-acbf-c311fa4d83c3-config-data\") pod \"a3d8046c-ee32-4f8a-acbf-c311fa4d83c3\" (UID: \"a3d8046c-ee32-4f8a-acbf-c311fa4d83c3\") " Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.585887 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3d8046c-ee32-4f8a-acbf-c311fa4d83c3-combined-ca-bundle\") pod \"a3d8046c-ee32-4f8a-acbf-c311fa4d83c3\" (UID: \"a3d8046c-ee32-4f8a-acbf-c311fa4d83c3\") " Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.585938 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz9cg\" (UniqueName: \"kubernetes.io/projected/a3d8046c-ee32-4f8a-acbf-c311fa4d83c3-kube-api-access-zz9cg\") pod \"a3d8046c-ee32-4f8a-acbf-c311fa4d83c3\" (UID: \"a3d8046c-ee32-4f8a-acbf-c311fa4d83c3\") " Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.586285 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3d8046c-ee32-4f8a-acbf-c311fa4d83c3-logs\") pod \"a3d8046c-ee32-4f8a-acbf-c311fa4d83c3\" (UID: \"a3d8046c-ee32-4f8a-acbf-c311fa4d83c3\") " Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.587062 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3d8046c-ee32-4f8a-acbf-c311fa4d83c3-logs" (OuterVolumeSpecName: "logs") pod "a3d8046c-ee32-4f8a-acbf-c311fa4d83c3" (UID: "a3d8046c-ee32-4f8a-acbf-c311fa4d83c3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.587819 4563 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3d8046c-ee32-4f8a-acbf-c311fa4d83c3-logs\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.595775 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3d8046c-ee32-4f8a-acbf-c311fa4d83c3-kube-api-access-zz9cg" (OuterVolumeSpecName: "kube-api-access-zz9cg") pod "a3d8046c-ee32-4f8a-acbf-c311fa4d83c3" (UID: "a3d8046c-ee32-4f8a-acbf-c311fa4d83c3"). InnerVolumeSpecName "kube-api-access-zz9cg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.618807 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3d8046c-ee32-4f8a-acbf-c311fa4d83c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3d8046c-ee32-4f8a-acbf-c311fa4d83c3" (UID: "a3d8046c-ee32-4f8a-acbf-c311fa4d83c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.627859 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3d8046c-ee32-4f8a-acbf-c311fa4d83c3-config-data" (OuterVolumeSpecName: "config-data") pod "a3d8046c-ee32-4f8a-acbf-c311fa4d83c3" (UID: "a3d8046c-ee32-4f8a-acbf-c311fa4d83c3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.689846 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3d8046c-ee32-4f8a-acbf-c311fa4d83c3-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.689878 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3d8046c-ee32-4f8a-acbf-c311fa4d83c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.689890 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz9cg\" (UniqueName: \"kubernetes.io/projected/a3d8046c-ee32-4f8a-acbf-c311fa4d83c3-kube-api-access-zz9cg\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.823001 4563 generic.go:334] "Generic (PLEG): container finished" podID="a3d8046c-ee32-4f8a-acbf-c311fa4d83c3" containerID="f16cbcfdd4d2f6b643a3c0bf9dfb1ca184dbb7b80e54169bed4d80b8aa36e0ab" exitCode=0 Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.823035 4563 generic.go:334] "Generic (PLEG): container finished" podID="a3d8046c-ee32-4f8a-acbf-c311fa4d83c3" containerID="972ed429e1a8e3ee6d09b9347573ebecd2530edf07254b5c3a808cc6bc3d2660" exitCode=143 Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.823912 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.831209 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a3d8046c-ee32-4f8a-acbf-c311fa4d83c3","Type":"ContainerDied","Data":"f16cbcfdd4d2f6b643a3c0bf9dfb1ca184dbb7b80e54169bed4d80b8aa36e0ab"} Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.831242 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a3d8046c-ee32-4f8a-acbf-c311fa4d83c3","Type":"ContainerDied","Data":"972ed429e1a8e3ee6d09b9347573ebecd2530edf07254b5c3a808cc6bc3d2660"} Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.831255 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a3d8046c-ee32-4f8a-acbf-c311fa4d83c3","Type":"ContainerDied","Data":"f71b24d04de89d380479ac0023c5ee17f13349b90700af6f507f44496e87a9cf"} Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.831272 4563 scope.go:117] "RemoveContainer" containerID="f16cbcfdd4d2f6b643a3c0bf9dfb1ca184dbb7b80e54169bed4d80b8aa36e0ab" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.852993 4563 scope.go:117] "RemoveContainer" containerID="972ed429e1a8e3ee6d09b9347573ebecd2530edf07254b5c3a808cc6bc3d2660" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.873145 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.879916 4563 scope.go:117] "RemoveContainer" containerID="f16cbcfdd4d2f6b643a3c0bf9dfb1ca184dbb7b80e54169bed4d80b8aa36e0ab" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.884331 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 09:20:22 crc kubenswrapper[4563]: E1124 09:20:22.888165 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f16cbcfdd4d2f6b643a3c0bf9dfb1ca184dbb7b80e54169bed4d80b8aa36e0ab\": container with ID starting with f16cbcfdd4d2f6b643a3c0bf9dfb1ca184dbb7b80e54169bed4d80b8aa36e0ab not found: ID does not exist" containerID="f16cbcfdd4d2f6b643a3c0bf9dfb1ca184dbb7b80e54169bed4d80b8aa36e0ab" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.888221 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f16cbcfdd4d2f6b643a3c0bf9dfb1ca184dbb7b80e54169bed4d80b8aa36e0ab"} err="failed to get container status \"f16cbcfdd4d2f6b643a3c0bf9dfb1ca184dbb7b80e54169bed4d80b8aa36e0ab\": rpc error: code = NotFound desc = could not find container \"f16cbcfdd4d2f6b643a3c0bf9dfb1ca184dbb7b80e54169bed4d80b8aa36e0ab\": container with ID starting with f16cbcfdd4d2f6b643a3c0bf9dfb1ca184dbb7b80e54169bed4d80b8aa36e0ab not found: ID does not exist" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.888251 4563 scope.go:117] "RemoveContainer" containerID="972ed429e1a8e3ee6d09b9347573ebecd2530edf07254b5c3a808cc6bc3d2660" Nov 24 09:20:22 crc kubenswrapper[4563]: E1124 09:20:22.888767 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"972ed429e1a8e3ee6d09b9347573ebecd2530edf07254b5c3a808cc6bc3d2660\": container with ID starting with 972ed429e1a8e3ee6d09b9347573ebecd2530edf07254b5c3a808cc6bc3d2660 not found: ID does not exist" containerID="972ed429e1a8e3ee6d09b9347573ebecd2530edf07254b5c3a808cc6bc3d2660" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.888789 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"972ed429e1a8e3ee6d09b9347573ebecd2530edf07254b5c3a808cc6bc3d2660"} err="failed to get container status \"972ed429e1a8e3ee6d09b9347573ebecd2530edf07254b5c3a808cc6bc3d2660\": rpc error: code = NotFound desc = could not find container \"972ed429e1a8e3ee6d09b9347573ebecd2530edf07254b5c3a808cc6bc3d2660\": container with ID 
starting with 972ed429e1a8e3ee6d09b9347573ebecd2530edf07254b5c3a808cc6bc3d2660 not found: ID does not exist" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.888807 4563 scope.go:117] "RemoveContainer" containerID="f16cbcfdd4d2f6b643a3c0bf9dfb1ca184dbb7b80e54169bed4d80b8aa36e0ab" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.889006 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f16cbcfdd4d2f6b643a3c0bf9dfb1ca184dbb7b80e54169bed4d80b8aa36e0ab"} err="failed to get container status \"f16cbcfdd4d2f6b643a3c0bf9dfb1ca184dbb7b80e54169bed4d80b8aa36e0ab\": rpc error: code = NotFound desc = could not find container \"f16cbcfdd4d2f6b643a3c0bf9dfb1ca184dbb7b80e54169bed4d80b8aa36e0ab\": container with ID starting with f16cbcfdd4d2f6b643a3c0bf9dfb1ca184dbb7b80e54169bed4d80b8aa36e0ab not found: ID does not exist" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.889026 4563 scope.go:117] "RemoveContainer" containerID="972ed429e1a8e3ee6d09b9347573ebecd2530edf07254b5c3a808cc6bc3d2660" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.889189 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"972ed429e1a8e3ee6d09b9347573ebecd2530edf07254b5c3a808cc6bc3d2660"} err="failed to get container status \"972ed429e1a8e3ee6d09b9347573ebecd2530edf07254b5c3a808cc6bc3d2660\": rpc error: code = NotFound desc = could not find container \"972ed429e1a8e3ee6d09b9347573ebecd2530edf07254b5c3a808cc6bc3d2660\": container with ID starting with 972ed429e1a8e3ee6d09b9347573ebecd2530edf07254b5c3a808cc6bc3d2660 not found: ID does not exist" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.893308 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 24 09:20:22 crc kubenswrapper[4563]: E1124 09:20:22.896095 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3d8046c-ee32-4f8a-acbf-c311fa4d83c3" 
containerName="nova-metadata-log" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.896119 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3d8046c-ee32-4f8a-acbf-c311fa4d83c3" containerName="nova-metadata-log" Nov 24 09:20:22 crc kubenswrapper[4563]: E1124 09:20:22.896142 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3d8046c-ee32-4f8a-acbf-c311fa4d83c3" containerName="nova-metadata-metadata" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.896149 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3d8046c-ee32-4f8a-acbf-c311fa4d83c3" containerName="nova-metadata-metadata" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.896353 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3d8046c-ee32-4f8a-acbf-c311fa4d83c3" containerName="nova-metadata-metadata" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.896372 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3d8046c-ee32-4f8a-acbf-c311fa4d83c3" containerName="nova-metadata-log" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.897425 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.899174 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.900163 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.905327 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.958382 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.995412 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r68sv\" (UniqueName: \"kubernetes.io/projected/84697c71-a492-4dde-b02e-496673c76d98-kube-api-access-r68sv\") pod \"nova-metadata-0\" (UID: \"84697c71-a492-4dde-b02e-496673c76d98\") " pod="openstack/nova-metadata-0" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.995627 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/84697c71-a492-4dde-b02e-496673c76d98-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"84697c71-a492-4dde-b02e-496673c76d98\") " pod="openstack/nova-metadata-0" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.995730 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84697c71-a492-4dde-b02e-496673c76d98-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"84697c71-a492-4dde-b02e-496673c76d98\") " pod="openstack/nova-metadata-0" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.996549 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84697c71-a492-4dde-b02e-496673c76d98-config-data\") pod \"nova-metadata-0\" (UID: \"84697c71-a492-4dde-b02e-496673c76d98\") " pod="openstack/nova-metadata-0" Nov 24 09:20:22 crc kubenswrapper[4563]: I1124 09:20:22.996674 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84697c71-a492-4dde-b02e-496673c76d98-logs\") pod \"nova-metadata-0\" (UID: \"84697c71-a492-4dde-b02e-496673c76d98\") " pod="openstack/nova-metadata-0" Nov 24 09:20:23 crc kubenswrapper[4563]: I1124 09:20:23.066718 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3d8046c-ee32-4f8a-acbf-c311fa4d83c3" path="/var/lib/kubelet/pods/a3d8046c-ee32-4f8a-acbf-c311fa4d83c3/volumes" Nov 24 09:20:23 crc kubenswrapper[4563]: I1124 09:20:23.099079 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r68sv\" (UniqueName: \"kubernetes.io/projected/84697c71-a492-4dde-b02e-496673c76d98-kube-api-access-r68sv\") pod \"nova-metadata-0\" (UID: \"84697c71-a492-4dde-b02e-496673c76d98\") " pod="openstack/nova-metadata-0" Nov 24 09:20:23 crc kubenswrapper[4563]: I1124 09:20:23.099138 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/84697c71-a492-4dde-b02e-496673c76d98-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"84697c71-a492-4dde-b02e-496673c76d98\") " pod="openstack/nova-metadata-0" Nov 24 09:20:23 crc kubenswrapper[4563]: I1124 09:20:23.099165 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84697c71-a492-4dde-b02e-496673c76d98-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"84697c71-a492-4dde-b02e-496673c76d98\") " 
pod="openstack/nova-metadata-0" Nov 24 09:20:23 crc kubenswrapper[4563]: I1124 09:20:23.099241 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84697c71-a492-4dde-b02e-496673c76d98-config-data\") pod \"nova-metadata-0\" (UID: \"84697c71-a492-4dde-b02e-496673c76d98\") " pod="openstack/nova-metadata-0" Nov 24 09:20:23 crc kubenswrapper[4563]: I1124 09:20:23.099274 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84697c71-a492-4dde-b02e-496673c76d98-logs\") pod \"nova-metadata-0\" (UID: \"84697c71-a492-4dde-b02e-496673c76d98\") " pod="openstack/nova-metadata-0" Nov 24 09:20:23 crc kubenswrapper[4563]: I1124 09:20:23.099601 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84697c71-a492-4dde-b02e-496673c76d98-logs\") pod \"nova-metadata-0\" (UID: \"84697c71-a492-4dde-b02e-496673c76d98\") " pod="openstack/nova-metadata-0" Nov 24 09:20:23 crc kubenswrapper[4563]: I1124 09:20:23.102370 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/84697c71-a492-4dde-b02e-496673c76d98-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"84697c71-a492-4dde-b02e-496673c76d98\") " pod="openstack/nova-metadata-0" Nov 24 09:20:23 crc kubenswrapper[4563]: I1124 09:20:23.103202 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84697c71-a492-4dde-b02e-496673c76d98-config-data\") pod \"nova-metadata-0\" (UID: \"84697c71-a492-4dde-b02e-496673c76d98\") " pod="openstack/nova-metadata-0" Nov 24 09:20:23 crc kubenswrapper[4563]: I1124 09:20:23.104204 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/84697c71-a492-4dde-b02e-496673c76d98-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"84697c71-a492-4dde-b02e-496673c76d98\") " pod="openstack/nova-metadata-0" Nov 24 09:20:23 crc kubenswrapper[4563]: I1124 09:20:23.111702 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r68sv\" (UniqueName: \"kubernetes.io/projected/84697c71-a492-4dde-b02e-496673c76d98-kube-api-access-r68sv\") pod \"nova-metadata-0\" (UID: \"84697c71-a492-4dde-b02e-496673c76d98\") " pod="openstack/nova-metadata-0" Nov 24 09:20:23 crc kubenswrapper[4563]: I1124 09:20:23.139312 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:23 crc kubenswrapper[4563]: I1124 09:20:23.218877 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 09:20:23 crc kubenswrapper[4563]: I1124 09:20:23.616090 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 09:20:23 crc kubenswrapper[4563]: I1124 09:20:23.832327 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"84697c71-a492-4dde-b02e-496673c76d98","Type":"ContainerStarted","Data":"78253240fa1cbd51e4b39e5024d7a2a39ad7217e3db0ed6d840ece5c1b2f893e"} Nov 24 09:20:23 crc kubenswrapper[4563]: I1124 09:20:23.832541 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"84697c71-a492-4dde-b02e-496673c76d98","Type":"ContainerStarted","Data":"2317e66d8116fecce05618f378abb42adaeae602c37cc002562ba1604a936290"} Nov 24 09:20:23 crc kubenswrapper[4563]: I1124 09:20:23.834598 4563 generic.go:334] "Generic (PLEG): container finished" podID="fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355" containerID="6d437b26233b7501cdb08b51f1fac6eba877d7312f963513539d4a7684268d44" exitCode=0 Nov 24 09:20:23 crc kubenswrapper[4563]: I1124 09:20:23.835414 4563 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-c6q5k" event={"ID":"fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355","Type":"ContainerDied","Data":"6d437b26233b7501cdb08b51f1fac6eba877d7312f963513539d4a7684268d44"} Nov 24 09:20:24 crc kubenswrapper[4563]: I1124 09:20:24.847609 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"84697c71-a492-4dde-b02e-496673c76d98","Type":"ContainerStarted","Data":"6263d6254b2b8fe8c00d5122fc0b12c8951044924524c0b7e5e6c5c557eab571"} Nov 24 09:20:24 crc kubenswrapper[4563]: I1124 09:20:24.863604 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.863591855 podStartE2EDuration="2.863591855s" podCreationTimestamp="2025-11-24 09:20:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:20:24.862010722 +0000 UTC m=+1002.120988169" watchObservedRunningTime="2025-11-24 09:20:24.863591855 +0000 UTC m=+1002.122569302" Nov 24 09:20:25 crc kubenswrapper[4563]: I1124 09:20:25.179738 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-c6q5k" Nov 24 09:20:25 crc kubenswrapper[4563]: I1124 09:20:25.239207 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355-combined-ca-bundle\") pod \"fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355\" (UID: \"fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355\") " Nov 24 09:20:25 crc kubenswrapper[4563]: I1124 09:20:25.239356 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355-config-data\") pod \"fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355\" (UID: \"fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355\") " Nov 24 09:20:25 crc kubenswrapper[4563]: I1124 09:20:25.239419 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355-scripts\") pod \"fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355\" (UID: \"fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355\") " Nov 24 09:20:25 crc kubenswrapper[4563]: I1124 09:20:25.239564 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bjzp\" (UniqueName: \"kubernetes.io/projected/fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355-kube-api-access-8bjzp\") pod \"fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355\" (UID: \"fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355\") " Nov 24 09:20:25 crc kubenswrapper[4563]: I1124 09:20:25.243459 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355-scripts" (OuterVolumeSpecName: "scripts") pod "fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355" (UID: "fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:25 crc kubenswrapper[4563]: I1124 09:20:25.244992 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355-kube-api-access-8bjzp" (OuterVolumeSpecName: "kube-api-access-8bjzp") pod "fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355" (UID: "fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355"). InnerVolumeSpecName "kube-api-access-8bjzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:20:25 crc kubenswrapper[4563]: I1124 09:20:25.265044 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355" (UID: "fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:25 crc kubenswrapper[4563]: I1124 09:20:25.265143 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355-config-data" (OuterVolumeSpecName: "config-data") pod "fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355" (UID: "fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:25 crc kubenswrapper[4563]: I1124 09:20:25.342319 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bjzp\" (UniqueName: \"kubernetes.io/projected/fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355-kube-api-access-8bjzp\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:25 crc kubenswrapper[4563]: I1124 09:20:25.342346 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:25 crc kubenswrapper[4563]: I1124 09:20:25.342356 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:25 crc kubenswrapper[4563]: I1124 09:20:25.342365 4563 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:25 crc kubenswrapper[4563]: I1124 09:20:25.861293 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-c6q5k" event={"ID":"fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355","Type":"ContainerDied","Data":"08825eafe4e4b38d264d11bdfd952827dfbd19d015aa3366066907df117cf8a0"} Nov 24 09:20:25 crc kubenswrapper[4563]: I1124 09:20:25.861549 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08825eafe4e4b38d264d11bdfd952827dfbd19d015aa3366066907df117cf8a0" Nov 24 09:20:25 crc kubenswrapper[4563]: I1124 09:20:25.861326 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-c6q5k" Nov 24 09:20:25 crc kubenswrapper[4563]: I1124 09:20:25.863304 4563 generic.go:334] "Generic (PLEG): container finished" podID="0ad9a449-c587-419b-8f09-3fa89ed6a90b" containerID="5a9ce07a645a3ee4fed06c306271eb75773a5e5411acca9077abbe25ed62eb05" exitCode=0 Nov 24 09:20:25 crc kubenswrapper[4563]: I1124 09:20:25.863600 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fhd9d" event={"ID":"0ad9a449-c587-419b-8f09-3fa89ed6a90b","Type":"ContainerDied","Data":"5a9ce07a645a3ee4fed06c306271eb75773a5e5411acca9077abbe25ed62eb05"} Nov 24 09:20:25 crc kubenswrapper[4563]: I1124 09:20:25.923504 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 24 09:20:25 crc kubenswrapper[4563]: E1124 09:20:25.923981 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355" containerName="nova-cell1-conductor-db-sync" Nov 24 09:20:25 crc kubenswrapper[4563]: I1124 09:20:25.924003 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355" containerName="nova-cell1-conductor-db-sync" Nov 24 09:20:25 crc kubenswrapper[4563]: I1124 09:20:25.924208 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355" containerName="nova-cell1-conductor-db-sync" Nov 24 09:20:25 crc kubenswrapper[4563]: I1124 09:20:25.924923 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 24 09:20:25 crc kubenswrapper[4563]: I1124 09:20:25.928720 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 24 09:20:25 crc kubenswrapper[4563]: I1124 09:20:25.935615 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 24 09:20:26 crc kubenswrapper[4563]: I1124 09:20:26.056978 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89f4e9e2-2f29-4076-a9e3-8513bfd1e07e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"89f4e9e2-2f29-4076-a9e3-8513bfd1e07e\") " pod="openstack/nova-cell1-conductor-0" Nov 24 09:20:26 crc kubenswrapper[4563]: I1124 09:20:26.057062 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89f4e9e2-2f29-4076-a9e3-8513bfd1e07e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"89f4e9e2-2f29-4076-a9e3-8513bfd1e07e\") " pod="openstack/nova-cell1-conductor-0" Nov 24 09:20:26 crc kubenswrapper[4563]: I1124 09:20:26.057268 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7stmv\" (UniqueName: \"kubernetes.io/projected/89f4e9e2-2f29-4076-a9e3-8513bfd1e07e-kube-api-access-7stmv\") pod \"nova-cell1-conductor-0\" (UID: \"89f4e9e2-2f29-4076-a9e3-8513bfd1e07e\") " pod="openstack/nova-cell1-conductor-0" Nov 24 09:20:26 crc kubenswrapper[4563]: I1124 09:20:26.159660 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7stmv\" (UniqueName: \"kubernetes.io/projected/89f4e9e2-2f29-4076-a9e3-8513bfd1e07e-kube-api-access-7stmv\") pod \"nova-cell1-conductor-0\" (UID: \"89f4e9e2-2f29-4076-a9e3-8513bfd1e07e\") " pod="openstack/nova-cell1-conductor-0" Nov 24 
09:20:26 crc kubenswrapper[4563]: I1124 09:20:26.159834 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89f4e9e2-2f29-4076-a9e3-8513bfd1e07e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"89f4e9e2-2f29-4076-a9e3-8513bfd1e07e\") " pod="openstack/nova-cell1-conductor-0" Nov 24 09:20:26 crc kubenswrapper[4563]: I1124 09:20:26.159872 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89f4e9e2-2f29-4076-a9e3-8513bfd1e07e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"89f4e9e2-2f29-4076-a9e3-8513bfd1e07e\") " pod="openstack/nova-cell1-conductor-0" Nov 24 09:20:26 crc kubenswrapper[4563]: I1124 09:20:26.166236 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89f4e9e2-2f29-4076-a9e3-8513bfd1e07e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"89f4e9e2-2f29-4076-a9e3-8513bfd1e07e\") " pod="openstack/nova-cell1-conductor-0" Nov 24 09:20:26 crc kubenswrapper[4563]: I1124 09:20:26.166255 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89f4e9e2-2f29-4076-a9e3-8513bfd1e07e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"89f4e9e2-2f29-4076-a9e3-8513bfd1e07e\") " pod="openstack/nova-cell1-conductor-0" Nov 24 09:20:26 crc kubenswrapper[4563]: I1124 09:20:26.177995 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7stmv\" (UniqueName: \"kubernetes.io/projected/89f4e9e2-2f29-4076-a9e3-8513bfd1e07e-kube-api-access-7stmv\") pod \"nova-cell1-conductor-0\" (UID: \"89f4e9e2-2f29-4076-a9e3-8513bfd1e07e\") " pod="openstack/nova-cell1-conductor-0" Nov 24 09:20:26 crc kubenswrapper[4563]: I1124 09:20:26.241966 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 24 09:20:26 crc kubenswrapper[4563]: W1124 09:20:26.630163 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89f4e9e2_2f29_4076_a9e3_8513bfd1e07e.slice/crio-cb9872d9806d2f23f72ec6fafb558475cc4cdc2e7bd90ec1af5d0347a75bb926 WatchSource:0}: Error finding container cb9872d9806d2f23f72ec6fafb558475cc4cdc2e7bd90ec1af5d0347a75bb926: Status 404 returned error can't find the container with id cb9872d9806d2f23f72ec6fafb558475cc4cdc2e7bd90ec1af5d0347a75bb926 Nov 24 09:20:26 crc kubenswrapper[4563]: I1124 09:20:26.630687 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 24 09:20:26 crc kubenswrapper[4563]: I1124 09:20:26.875831 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"89f4e9e2-2f29-4076-a9e3-8513bfd1e07e","Type":"ContainerStarted","Data":"b3c5c9aab30471ad89a0eaf4ddf2748d8c363859b51fa44cc8e28454498d242d"} Nov 24 09:20:26 crc kubenswrapper[4563]: I1124 09:20:26.875908 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"89f4e9e2-2f29-4076-a9e3-8513bfd1e07e","Type":"ContainerStarted","Data":"cb9872d9806d2f23f72ec6fafb558475cc4cdc2e7bd90ec1af5d0347a75bb926"} Nov 24 09:20:26 crc kubenswrapper[4563]: I1124 09:20:26.895959 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.895940763 podStartE2EDuration="1.895940763s" podCreationTimestamp="2025-11-24 09:20:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:20:26.892988435 +0000 UTC m=+1004.151965882" watchObservedRunningTime="2025-11-24 09:20:26.895940763 +0000 UTC m=+1004.154918210" Nov 24 09:20:27 crc kubenswrapper[4563]: I1124 
09:20:27.133956 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fhd9d" Nov 24 09:20:27 crc kubenswrapper[4563]: I1124 09:20:27.284945 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ad9a449-c587-419b-8f09-3fa89ed6a90b-config-data\") pod \"0ad9a449-c587-419b-8f09-3fa89ed6a90b\" (UID: \"0ad9a449-c587-419b-8f09-3fa89ed6a90b\") " Nov 24 09:20:27 crc kubenswrapper[4563]: I1124 09:20:27.285055 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ad9a449-c587-419b-8f09-3fa89ed6a90b-combined-ca-bundle\") pod \"0ad9a449-c587-419b-8f09-3fa89ed6a90b\" (UID: \"0ad9a449-c587-419b-8f09-3fa89ed6a90b\") " Nov 24 09:20:27 crc kubenswrapper[4563]: I1124 09:20:27.285178 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ad9a449-c587-419b-8f09-3fa89ed6a90b-scripts\") pod \"0ad9a449-c587-419b-8f09-3fa89ed6a90b\" (UID: \"0ad9a449-c587-419b-8f09-3fa89ed6a90b\") " Nov 24 09:20:27 crc kubenswrapper[4563]: I1124 09:20:27.285204 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgwh6\" (UniqueName: \"kubernetes.io/projected/0ad9a449-c587-419b-8f09-3fa89ed6a90b-kube-api-access-wgwh6\") pod \"0ad9a449-c587-419b-8f09-3fa89ed6a90b\" (UID: \"0ad9a449-c587-419b-8f09-3fa89ed6a90b\") " Nov 24 09:20:27 crc kubenswrapper[4563]: I1124 09:20:27.290700 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ad9a449-c587-419b-8f09-3fa89ed6a90b-kube-api-access-wgwh6" (OuterVolumeSpecName: "kube-api-access-wgwh6") pod "0ad9a449-c587-419b-8f09-3fa89ed6a90b" (UID: "0ad9a449-c587-419b-8f09-3fa89ed6a90b"). InnerVolumeSpecName "kube-api-access-wgwh6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:20:27 crc kubenswrapper[4563]: I1124 09:20:27.298489 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ad9a449-c587-419b-8f09-3fa89ed6a90b-scripts" (OuterVolumeSpecName: "scripts") pod "0ad9a449-c587-419b-8f09-3fa89ed6a90b" (UID: "0ad9a449-c587-419b-8f09-3fa89ed6a90b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:27 crc kubenswrapper[4563]: I1124 09:20:27.308693 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ad9a449-c587-419b-8f09-3fa89ed6a90b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ad9a449-c587-419b-8f09-3fa89ed6a90b" (UID: "0ad9a449-c587-419b-8f09-3fa89ed6a90b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:27 crc kubenswrapper[4563]: I1124 09:20:27.309457 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ad9a449-c587-419b-8f09-3fa89ed6a90b-config-data" (OuterVolumeSpecName: "config-data") pod "0ad9a449-c587-419b-8f09-3fa89ed6a90b" (UID: "0ad9a449-c587-419b-8f09-3fa89ed6a90b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:27 crc kubenswrapper[4563]: I1124 09:20:27.387918 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ad9a449-c587-419b-8f09-3fa89ed6a90b-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:27 crc kubenswrapper[4563]: I1124 09:20:27.387946 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ad9a449-c587-419b-8f09-3fa89ed6a90b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:27 crc kubenswrapper[4563]: I1124 09:20:27.387957 4563 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ad9a449-c587-419b-8f09-3fa89ed6a90b-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:27 crc kubenswrapper[4563]: I1124 09:20:27.387966 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgwh6\" (UniqueName: \"kubernetes.io/projected/0ad9a449-c587-419b-8f09-3fa89ed6a90b-kube-api-access-wgwh6\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:27 crc kubenswrapper[4563]: I1124 09:20:27.827808 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 24 09:20:27 crc kubenswrapper[4563]: I1124 09:20:27.827867 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 24 09:20:27 crc kubenswrapper[4563]: I1124 09:20:27.887138 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fhd9d" event={"ID":"0ad9a449-c587-419b-8f09-3fa89ed6a90b","Type":"ContainerDied","Data":"290093a30ed6218bcfbd0b8105cdc22e0332d269be13551c824bc844b9328bc4"} Nov 24 09:20:27 crc kubenswrapper[4563]: I1124 09:20:27.887168 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fhd9d" Nov 24 09:20:27 crc kubenswrapper[4563]: I1124 09:20:27.887185 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="290093a30ed6218bcfbd0b8105cdc22e0332d269be13551c824bc844b9328bc4" Nov 24 09:20:27 crc kubenswrapper[4563]: I1124 09:20:27.887295 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Nov 24 09:20:27 crc kubenswrapper[4563]: I1124 09:20:27.957707 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 24 09:20:27 crc kubenswrapper[4563]: I1124 09:20:27.981959 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.057117 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.057269 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="156cd303-99b2-4b2b-b149-1529e89a98ed" containerName="nova-api-log" containerID="cri-o://3bbe161e57a24dce5944830853213c9ee9c5a7ec434a36e7ca752161bec6a1e4" gracePeriod=30 Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.057707 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="156cd303-99b2-4b2b-b149-1529e89a98ed" containerName="nova-api-api" containerID="cri-o://96237b7b48247229d5b773a81bc65af32178722ae26cb0431156e33c7eaebb09" gracePeriod=30 Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.068880 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="156cd303-99b2-4b2b-b149-1529e89a98ed" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.190:8774/\": EOF" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.073051 4563 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.076214 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="156cd303-99b2-4b2b-b149-1529e89a98ed" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.190:8774/\": EOF" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.095789 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.096052 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="84697c71-a492-4dde-b02e-496673c76d98" containerName="nova-metadata-log" containerID="cri-o://78253240fa1cbd51e4b39e5024d7a2a39ad7217e3db0ed6d840ece5c1b2f893e" gracePeriod=30 Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.096515 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="84697c71-a492-4dde-b02e-496673c76d98" containerName="nova-metadata-metadata" containerID="cri-o://6263d6254b2b8fe8c00d5122fc0b12c8951044924524c0b7e5e6c5c557eab571" gracePeriod=30 Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.158832 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.215147 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-797bbc649-f7c7l"] Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.215381 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-797bbc649-f7c7l" podUID="f99d33d2-61e7-452c-9032-f2be6301ac6d" containerName="dnsmasq-dns" containerID="cri-o://cb61e886c7e087b2b2709b91942be4dc5c8db8b656ed0610f97ff184e29ff520" gracePeriod=10 Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.220074 
4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.220113 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.684839 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.692040 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-797bbc649-f7c7l" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.825864 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r68sv\" (UniqueName: \"kubernetes.io/projected/84697c71-a492-4dde-b02e-496673c76d98-kube-api-access-r68sv\") pod \"84697c71-a492-4dde-b02e-496673c76d98\" (UID: \"84697c71-a492-4dde-b02e-496673c76d98\") " Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.825946 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-config\") pod \"f99d33d2-61e7-452c-9032-f2be6301ac6d\" (UID: \"f99d33d2-61e7-452c-9032-f2be6301ac6d\") " Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.826021 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84697c71-a492-4dde-b02e-496673c76d98-config-data\") pod \"84697c71-a492-4dde-b02e-496673c76d98\" (UID: \"84697c71-a492-4dde-b02e-496673c76d98\") " Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.826074 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84697c71-a492-4dde-b02e-496673c76d98-logs\") pod \"84697c71-a492-4dde-b02e-496673c76d98\" (UID: 
\"84697c71-a492-4dde-b02e-496673c76d98\") " Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.826193 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-ovsdbserver-nb\") pod \"f99d33d2-61e7-452c-9032-f2be6301ac6d\" (UID: \"f99d33d2-61e7-452c-9032-f2be6301ac6d\") " Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.826227 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-dns-swift-storage-0\") pod \"f99d33d2-61e7-452c-9032-f2be6301ac6d\" (UID: \"f99d33d2-61e7-452c-9032-f2be6301ac6d\") " Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.826272 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/84697c71-a492-4dde-b02e-496673c76d98-nova-metadata-tls-certs\") pod \"84697c71-a492-4dde-b02e-496673c76d98\" (UID: \"84697c71-a492-4dde-b02e-496673c76d98\") " Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.826305 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-dns-svc\") pod \"f99d33d2-61e7-452c-9032-f2be6301ac6d\" (UID: \"f99d33d2-61e7-452c-9032-f2be6301ac6d\") " Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.826343 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-ovsdbserver-sb\") pod \"f99d33d2-61e7-452c-9032-f2be6301ac6d\" (UID: \"f99d33d2-61e7-452c-9032-f2be6301ac6d\") " Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.826410 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/84697c71-a492-4dde-b02e-496673c76d98-logs" (OuterVolumeSpecName: "logs") pod "84697c71-a492-4dde-b02e-496673c76d98" (UID: "84697c71-a492-4dde-b02e-496673c76d98"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.826426 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84697c71-a492-4dde-b02e-496673c76d98-combined-ca-bundle\") pod \"84697c71-a492-4dde-b02e-496673c76d98\" (UID: \"84697c71-a492-4dde-b02e-496673c76d98\") " Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.826624 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxsj7\" (UniqueName: \"kubernetes.io/projected/f99d33d2-61e7-452c-9032-f2be6301ac6d-kube-api-access-kxsj7\") pod \"f99d33d2-61e7-452c-9032-f2be6301ac6d\" (UID: \"f99d33d2-61e7-452c-9032-f2be6301ac6d\") " Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.827570 4563 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84697c71-a492-4dde-b02e-496673c76d98-logs\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.831959 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f99d33d2-61e7-452c-9032-f2be6301ac6d-kube-api-access-kxsj7" (OuterVolumeSpecName: "kube-api-access-kxsj7") pod "f99d33d2-61e7-452c-9032-f2be6301ac6d" (UID: "f99d33d2-61e7-452c-9032-f2be6301ac6d"). InnerVolumeSpecName "kube-api-access-kxsj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.835756 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84697c71-a492-4dde-b02e-496673c76d98-kube-api-access-r68sv" (OuterVolumeSpecName: "kube-api-access-r68sv") pod "84697c71-a492-4dde-b02e-496673c76d98" (UID: "84697c71-a492-4dde-b02e-496673c76d98"). InnerVolumeSpecName "kube-api-access-r68sv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.853153 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84697c71-a492-4dde-b02e-496673c76d98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84697c71-a492-4dde-b02e-496673c76d98" (UID: "84697c71-a492-4dde-b02e-496673c76d98"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.861344 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84697c71-a492-4dde-b02e-496673c76d98-config-data" (OuterVolumeSpecName: "config-data") pod "84697c71-a492-4dde-b02e-496673c76d98" (UID: "84697c71-a492-4dde-b02e-496673c76d98"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.870602 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f99d33d2-61e7-452c-9032-f2be6301ac6d" (UID: "f99d33d2-61e7-452c-9032-f2be6301ac6d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.871321 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f99d33d2-61e7-452c-9032-f2be6301ac6d" (UID: "f99d33d2-61e7-452c-9032-f2be6301ac6d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.875767 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f99d33d2-61e7-452c-9032-f2be6301ac6d" (UID: "f99d33d2-61e7-452c-9032-f2be6301ac6d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.877590 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-config" (OuterVolumeSpecName: "config") pod "f99d33d2-61e7-452c-9032-f2be6301ac6d" (UID: "f99d33d2-61e7-452c-9032-f2be6301ac6d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.878927 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84697c71-a492-4dde-b02e-496673c76d98-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "84697c71-a492-4dde-b02e-496673c76d98" (UID: "84697c71-a492-4dde-b02e-496673c76d98"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.889111 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f99d33d2-61e7-452c-9032-f2be6301ac6d" (UID: "f99d33d2-61e7-452c-9032-f2be6301ac6d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.898208 4563 generic.go:334] "Generic (PLEG): container finished" podID="f99d33d2-61e7-452c-9032-f2be6301ac6d" containerID="cb61e886c7e087b2b2709b91942be4dc5c8db8b656ed0610f97ff184e29ff520" exitCode=0 Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.898265 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-797bbc649-f7c7l" event={"ID":"f99d33d2-61e7-452c-9032-f2be6301ac6d","Type":"ContainerDied","Data":"cb61e886c7e087b2b2709b91942be4dc5c8db8b656ed0610f97ff184e29ff520"} Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.898298 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-797bbc649-f7c7l" event={"ID":"f99d33d2-61e7-452c-9032-f2be6301ac6d","Type":"ContainerDied","Data":"a4f76e64693781f3918d25a3c57fa34985b82491d7e9fc266dd5fbd41d4859d6"} Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.898316 4563 scope.go:117] "RemoveContainer" containerID="cb61e886c7e087b2b2709b91942be4dc5c8db8b656ed0610f97ff184e29ff520" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.898430 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-797bbc649-f7c7l" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.906103 4563 generic.go:334] "Generic (PLEG): container finished" podID="84697c71-a492-4dde-b02e-496673c76d98" containerID="6263d6254b2b8fe8c00d5122fc0b12c8951044924524c0b7e5e6c5c557eab571" exitCode=0 Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.906124 4563 generic.go:334] "Generic (PLEG): container finished" podID="84697c71-a492-4dde-b02e-496673c76d98" containerID="78253240fa1cbd51e4b39e5024d7a2a39ad7217e3db0ed6d840ece5c1b2f893e" exitCode=143 Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.906161 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.906173 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"84697c71-a492-4dde-b02e-496673c76d98","Type":"ContainerDied","Data":"6263d6254b2b8fe8c00d5122fc0b12c8951044924524c0b7e5e6c5c557eab571"} Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.906192 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"84697c71-a492-4dde-b02e-496673c76d98","Type":"ContainerDied","Data":"78253240fa1cbd51e4b39e5024d7a2a39ad7217e3db0ed6d840ece5c1b2f893e"} Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.906204 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"84697c71-a492-4dde-b02e-496673c76d98","Type":"ContainerDied","Data":"2317e66d8116fecce05618f378abb42adaeae602c37cc002562ba1604a936290"} Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.909061 4563 generic.go:334] "Generic (PLEG): container finished" podID="156cd303-99b2-4b2b-b149-1529e89a98ed" containerID="3bbe161e57a24dce5944830853213c9ee9c5a7ec434a36e7ca752161bec6a1e4" exitCode=143 Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.909348 4563 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"156cd303-99b2-4b2b-b149-1529e89a98ed","Type":"ContainerDied","Data":"3bbe161e57a24dce5944830853213c9ee9c5a7ec434a36e7ca752161bec6a1e4"} Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.936198 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.936224 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84697c71-a492-4dde-b02e-496673c76d98-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.936236 4563 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.936246 4563 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.936255 4563 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/84697c71-a492-4dde-b02e-496673c76d98-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.936265 4563 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.936273 4563 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/f99d33d2-61e7-452c-9032-f2be6301ac6d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.936282 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84697c71-a492-4dde-b02e-496673c76d98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.936290 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxsj7\" (UniqueName: \"kubernetes.io/projected/f99d33d2-61e7-452c-9032-f2be6301ac6d-kube-api-access-kxsj7\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.936299 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r68sv\" (UniqueName: \"kubernetes.io/projected/84697c71-a492-4dde-b02e-496673c76d98-kube-api-access-r68sv\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.939255 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.955831 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.957322 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.959875 4563 scope.go:117] "RemoveContainer" containerID="15ccc2076e8db6b3ce09ce2b1dc14c3d41f18f90f1b46bd31571188c80a2901c" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.967015 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-797bbc649-f7c7l"] Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.972430 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-797bbc649-f7c7l"] Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.993182 4563 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 24 09:20:28 crc kubenswrapper[4563]: E1124 09:20:28.993702 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84697c71-a492-4dde-b02e-496673c76d98" containerName="nova-metadata-log" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.993721 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="84697c71-a492-4dde-b02e-496673c76d98" containerName="nova-metadata-log" Nov 24 09:20:28 crc kubenswrapper[4563]: E1124 09:20:28.993749 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f99d33d2-61e7-452c-9032-f2be6301ac6d" containerName="init" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.993757 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="f99d33d2-61e7-452c-9032-f2be6301ac6d" containerName="init" Nov 24 09:20:28 crc kubenswrapper[4563]: E1124 09:20:28.993781 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ad9a449-c587-419b-8f09-3fa89ed6a90b" containerName="nova-manage" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.993788 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ad9a449-c587-419b-8f09-3fa89ed6a90b" containerName="nova-manage" Nov 24 09:20:28 crc kubenswrapper[4563]: E1124 09:20:28.993803 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84697c71-a492-4dde-b02e-496673c76d98" containerName="nova-metadata-metadata" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.993808 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="84697c71-a492-4dde-b02e-496673c76d98" containerName="nova-metadata-metadata" Nov 24 09:20:28 crc kubenswrapper[4563]: E1124 09:20:28.993819 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f99d33d2-61e7-452c-9032-f2be6301ac6d" containerName="dnsmasq-dns" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.993824 4563 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f99d33d2-61e7-452c-9032-f2be6301ac6d" containerName="dnsmasq-dns" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.994013 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="84697c71-a492-4dde-b02e-496673c76d98" containerName="nova-metadata-log" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.994027 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ad9a449-c587-419b-8f09-3fa89ed6a90b" containerName="nova-manage" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.994046 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="84697c71-a492-4dde-b02e-496673c76d98" containerName="nova-metadata-metadata" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.994053 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="f99d33d2-61e7-452c-9032-f2be6301ac6d" containerName="dnsmasq-dns" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.995149 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.997088 4563 scope.go:117] "RemoveContainer" containerID="cb61e886c7e087b2b2709b91942be4dc5c8db8b656ed0610f97ff184e29ff520" Nov 24 09:20:28 crc kubenswrapper[4563]: E1124 09:20:28.997543 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb61e886c7e087b2b2709b91942be4dc5c8db8b656ed0610f97ff184e29ff520\": container with ID starting with cb61e886c7e087b2b2709b91942be4dc5c8db8b656ed0610f97ff184e29ff520 not found: ID does not exist" containerID="cb61e886c7e087b2b2709b91942be4dc5c8db8b656ed0610f97ff184e29ff520" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.997570 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb61e886c7e087b2b2709b91942be4dc5c8db8b656ed0610f97ff184e29ff520"} err="failed to get container status 
\"cb61e886c7e087b2b2709b91942be4dc5c8db8b656ed0610f97ff184e29ff520\": rpc error: code = NotFound desc = could not find container \"cb61e886c7e087b2b2709b91942be4dc5c8db8b656ed0610f97ff184e29ff520\": container with ID starting with cb61e886c7e087b2b2709b91942be4dc5c8db8b656ed0610f97ff184e29ff520 not found: ID does not exist" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.997591 4563 scope.go:117] "RemoveContainer" containerID="15ccc2076e8db6b3ce09ce2b1dc14c3d41f18f90f1b46bd31571188c80a2901c" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.997747 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.997946 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 24 09:20:28 crc kubenswrapper[4563]: E1124 09:20:28.998307 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15ccc2076e8db6b3ce09ce2b1dc14c3d41f18f90f1b46bd31571188c80a2901c\": container with ID starting with 15ccc2076e8db6b3ce09ce2b1dc14c3d41f18f90f1b46bd31571188c80a2901c not found: ID does not exist" containerID="15ccc2076e8db6b3ce09ce2b1dc14c3d41f18f90f1b46bd31571188c80a2901c" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.998346 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15ccc2076e8db6b3ce09ce2b1dc14c3d41f18f90f1b46bd31571188c80a2901c"} err="failed to get container status \"15ccc2076e8db6b3ce09ce2b1dc14c3d41f18f90f1b46bd31571188c80a2901c\": rpc error: code = NotFound desc = could not find container \"15ccc2076e8db6b3ce09ce2b1dc14c3d41f18f90f1b46bd31571188c80a2901c\": container with ID starting with 15ccc2076e8db6b3ce09ce2b1dc14c3d41f18f90f1b46bd31571188c80a2901c not found: ID does not exist" Nov 24 09:20:28 crc kubenswrapper[4563]: I1124 09:20:28.998372 4563 scope.go:117] 
"RemoveContainer" containerID="6263d6254b2b8fe8c00d5122fc0b12c8951044924524c0b7e5e6c5c557eab571" Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.002599 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.023985 4563 scope.go:117] "RemoveContainer" containerID="78253240fa1cbd51e4b39e5024d7a2a39ad7217e3db0ed6d840ece5c1b2f893e" Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.041296 4563 scope.go:117] "RemoveContainer" containerID="6263d6254b2b8fe8c00d5122fc0b12c8951044924524c0b7e5e6c5c557eab571" Nov 24 09:20:29 crc kubenswrapper[4563]: E1124 09:20:29.041809 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6263d6254b2b8fe8c00d5122fc0b12c8951044924524c0b7e5e6c5c557eab571\": container with ID starting with 6263d6254b2b8fe8c00d5122fc0b12c8951044924524c0b7e5e6c5c557eab571 not found: ID does not exist" containerID="6263d6254b2b8fe8c00d5122fc0b12c8951044924524c0b7e5e6c5c557eab571" Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.041858 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6263d6254b2b8fe8c00d5122fc0b12c8951044924524c0b7e5e6c5c557eab571"} err="failed to get container status \"6263d6254b2b8fe8c00d5122fc0b12c8951044924524c0b7e5e6c5c557eab571\": rpc error: code = NotFound desc = could not find container \"6263d6254b2b8fe8c00d5122fc0b12c8951044924524c0b7e5e6c5c557eab571\": container with ID starting with 6263d6254b2b8fe8c00d5122fc0b12c8951044924524c0b7e5e6c5c557eab571 not found: ID does not exist" Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.041900 4563 scope.go:117] "RemoveContainer" containerID="78253240fa1cbd51e4b39e5024d7a2a39ad7217e3db0ed6d840ece5c1b2f893e" Nov 24 09:20:29 crc kubenswrapper[4563]: E1124 09:20:29.042293 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"78253240fa1cbd51e4b39e5024d7a2a39ad7217e3db0ed6d840ece5c1b2f893e\": container with ID starting with 78253240fa1cbd51e4b39e5024d7a2a39ad7217e3db0ed6d840ece5c1b2f893e not found: ID does not exist" containerID="78253240fa1cbd51e4b39e5024d7a2a39ad7217e3db0ed6d840ece5c1b2f893e" Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.042317 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78253240fa1cbd51e4b39e5024d7a2a39ad7217e3db0ed6d840ece5c1b2f893e"} err="failed to get container status \"78253240fa1cbd51e4b39e5024d7a2a39ad7217e3db0ed6d840ece5c1b2f893e\": rpc error: code = NotFound desc = could not find container \"78253240fa1cbd51e4b39e5024d7a2a39ad7217e3db0ed6d840ece5c1b2f893e\": container with ID starting with 78253240fa1cbd51e4b39e5024d7a2a39ad7217e3db0ed6d840ece5c1b2f893e not found: ID does not exist" Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.042334 4563 scope.go:117] "RemoveContainer" containerID="6263d6254b2b8fe8c00d5122fc0b12c8951044924524c0b7e5e6c5c557eab571" Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.042721 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6263d6254b2b8fe8c00d5122fc0b12c8951044924524c0b7e5e6c5c557eab571"} err="failed to get container status \"6263d6254b2b8fe8c00d5122fc0b12c8951044924524c0b7e5e6c5c557eab571\": rpc error: code = NotFound desc = could not find container \"6263d6254b2b8fe8c00d5122fc0b12c8951044924524c0b7e5e6c5c557eab571\": container with ID starting with 6263d6254b2b8fe8c00d5122fc0b12c8951044924524c0b7e5e6c5c557eab571 not found: ID does not exist" Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.042746 4563 scope.go:117] "RemoveContainer" containerID="78253240fa1cbd51e4b39e5024d7a2a39ad7217e3db0ed6d840ece5c1b2f893e" Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.043056 4563 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"78253240fa1cbd51e4b39e5024d7a2a39ad7217e3db0ed6d840ece5c1b2f893e"} err="failed to get container status \"78253240fa1cbd51e4b39e5024d7a2a39ad7217e3db0ed6d840ece5c1b2f893e\": rpc error: code = NotFound desc = could not find container \"78253240fa1cbd51e4b39e5024d7a2a39ad7217e3db0ed6d840ece5c1b2f893e\": container with ID starting with 78253240fa1cbd51e4b39e5024d7a2a39ad7217e3db0ed6d840ece5c1b2f893e not found: ID does not exist" Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.072226 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84697c71-a492-4dde-b02e-496673c76d98" path="/var/lib/kubelet/pods/84697c71-a492-4dde-b02e-496673c76d98/volumes" Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.072962 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f99d33d2-61e7-452c-9032-f2be6301ac6d" path="/var/lib/kubelet/pods/f99d33d2-61e7-452c-9032-f2be6301ac6d/volumes" Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.141720 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0301f820-dbcb-4cad-a180-7aed80a46db6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0301f820-dbcb-4cad-a180-7aed80a46db6\") " pod="openstack/nova-metadata-0" Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.141772 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0301f820-dbcb-4cad-a180-7aed80a46db6-config-data\") pod \"nova-metadata-0\" (UID: \"0301f820-dbcb-4cad-a180-7aed80a46db6\") " pod="openstack/nova-metadata-0" Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.141820 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0301f820-dbcb-4cad-a180-7aed80a46db6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0301f820-dbcb-4cad-a180-7aed80a46db6\") " pod="openstack/nova-metadata-0" Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.141955 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm4dx\" (UniqueName: \"kubernetes.io/projected/0301f820-dbcb-4cad-a180-7aed80a46db6-kube-api-access-mm4dx\") pod \"nova-metadata-0\" (UID: \"0301f820-dbcb-4cad-a180-7aed80a46db6\") " pod="openstack/nova-metadata-0" Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.141994 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0301f820-dbcb-4cad-a180-7aed80a46db6-logs\") pod \"nova-metadata-0\" (UID: \"0301f820-dbcb-4cad-a180-7aed80a46db6\") " pod="openstack/nova-metadata-0" Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.244078 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm4dx\" (UniqueName: \"kubernetes.io/projected/0301f820-dbcb-4cad-a180-7aed80a46db6-kube-api-access-mm4dx\") pod \"nova-metadata-0\" (UID: \"0301f820-dbcb-4cad-a180-7aed80a46db6\") " pod="openstack/nova-metadata-0" Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.244956 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0301f820-dbcb-4cad-a180-7aed80a46db6-logs\") pod \"nova-metadata-0\" (UID: \"0301f820-dbcb-4cad-a180-7aed80a46db6\") " pod="openstack/nova-metadata-0" Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.245021 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0301f820-dbcb-4cad-a180-7aed80a46db6-logs\") pod \"nova-metadata-0\" (UID: \"0301f820-dbcb-4cad-a180-7aed80a46db6\") " 
pod="openstack/nova-metadata-0" Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.245160 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0301f820-dbcb-4cad-a180-7aed80a46db6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0301f820-dbcb-4cad-a180-7aed80a46db6\") " pod="openstack/nova-metadata-0" Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.245205 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0301f820-dbcb-4cad-a180-7aed80a46db6-config-data\") pod \"nova-metadata-0\" (UID: \"0301f820-dbcb-4cad-a180-7aed80a46db6\") " pod="openstack/nova-metadata-0" Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.245227 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0301f820-dbcb-4cad-a180-7aed80a46db6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0301f820-dbcb-4cad-a180-7aed80a46db6\") " pod="openstack/nova-metadata-0" Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.248798 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0301f820-dbcb-4cad-a180-7aed80a46db6-config-data\") pod \"nova-metadata-0\" (UID: \"0301f820-dbcb-4cad-a180-7aed80a46db6\") " pod="openstack/nova-metadata-0" Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.249153 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0301f820-dbcb-4cad-a180-7aed80a46db6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0301f820-dbcb-4cad-a180-7aed80a46db6\") " pod="openstack/nova-metadata-0" Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.252185 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0301f820-dbcb-4cad-a180-7aed80a46db6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0301f820-dbcb-4cad-a180-7aed80a46db6\") " pod="openstack/nova-metadata-0" Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.259485 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm4dx\" (UniqueName: \"kubernetes.io/projected/0301f820-dbcb-4cad-a180-7aed80a46db6-kube-api-access-mm4dx\") pod \"nova-metadata-0\" (UID: \"0301f820-dbcb-4cad-a180-7aed80a46db6\") " pod="openstack/nova-metadata-0" Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.311852 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.740078 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 09:20:29 crc kubenswrapper[4563]: W1124 09:20:29.743758 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0301f820_dbcb_4cad_a180_7aed80a46db6.slice/crio-b093feded9204183db51b34303c50631cc5585b3e2fc47e5dd990f2c21cae7d1 WatchSource:0}: Error finding container b093feded9204183db51b34303c50631cc5585b3e2fc47e5dd990f2c21cae7d1: Status 404 returned error can't find the container with id b093feded9204183db51b34303c50631cc5585b3e2fc47e5dd990f2c21cae7d1 Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.924949 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0301f820-dbcb-4cad-a180-7aed80a46db6","Type":"ContainerStarted","Data":"8e45d102bfc00337918f022eba7fdc2ec81c58445a0f8a5b0918143e9e719448"} Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.925012 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"0301f820-dbcb-4cad-a180-7aed80a46db6","Type":"ContainerStarted","Data":"b093feded9204183db51b34303c50631cc5585b3e2fc47e5dd990f2c21cae7d1"} Nov 24 09:20:29 crc kubenswrapper[4563]: I1124 09:20:29.935561 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="543b029c-5742-4c86-87c5-2c0f6dee9431" containerName="nova-scheduler-scheduler" containerID="cri-o://52d586be505a6d6205c69f838d8b9a0f5b9a32a4e239ff61aa62d9482e6447b8" gracePeriod=30 Nov 24 09:20:30 crc kubenswrapper[4563]: I1124 09:20:30.936750 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0301f820-dbcb-4cad-a180-7aed80a46db6","Type":"ContainerStarted","Data":"ed5c0d0d29cd0fdfbcabfda109f7a94e42f2acedf61384a52af79263053de4f5"} Nov 24 09:20:30 crc kubenswrapper[4563]: I1124 09:20:30.954322 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.954304064 podStartE2EDuration="2.954304064s" podCreationTimestamp="2025-11-24 09:20:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:20:30.951951417 +0000 UTC m=+1008.210928864" watchObservedRunningTime="2025-11-24 09:20:30.954304064 +0000 UTC m=+1008.213281511" Nov 24 09:20:31 crc kubenswrapper[4563]: I1124 09:20:31.265961 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Nov 24 09:20:32 crc kubenswrapper[4563]: I1124 09:20:32.758563 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 09:20:32 crc kubenswrapper[4563]: I1124 09:20:32.934378 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vxlj\" (UniqueName: \"kubernetes.io/projected/543b029c-5742-4c86-87c5-2c0f6dee9431-kube-api-access-9vxlj\") pod \"543b029c-5742-4c86-87c5-2c0f6dee9431\" (UID: \"543b029c-5742-4c86-87c5-2c0f6dee9431\") " Nov 24 09:20:32 crc kubenswrapper[4563]: I1124 09:20:32.934431 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/543b029c-5742-4c86-87c5-2c0f6dee9431-config-data\") pod \"543b029c-5742-4c86-87c5-2c0f6dee9431\" (UID: \"543b029c-5742-4c86-87c5-2c0f6dee9431\") " Nov 24 09:20:32 crc kubenswrapper[4563]: I1124 09:20:32.934536 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/543b029c-5742-4c86-87c5-2c0f6dee9431-combined-ca-bundle\") pod \"543b029c-5742-4c86-87c5-2c0f6dee9431\" (UID: \"543b029c-5742-4c86-87c5-2c0f6dee9431\") " Nov 24 09:20:32 crc kubenswrapper[4563]: I1124 09:20:32.942710 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/543b029c-5742-4c86-87c5-2c0f6dee9431-kube-api-access-9vxlj" (OuterVolumeSpecName: "kube-api-access-9vxlj") pod "543b029c-5742-4c86-87c5-2c0f6dee9431" (UID: "543b029c-5742-4c86-87c5-2c0f6dee9431"). InnerVolumeSpecName "kube-api-access-9vxlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:20:32 crc kubenswrapper[4563]: I1124 09:20:32.962024 4563 generic.go:334] "Generic (PLEG): container finished" podID="543b029c-5742-4c86-87c5-2c0f6dee9431" containerID="52d586be505a6d6205c69f838d8b9a0f5b9a32a4e239ff61aa62d9482e6447b8" exitCode=0 Nov 24 09:20:32 crc kubenswrapper[4563]: I1124 09:20:32.962097 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 09:20:32 crc kubenswrapper[4563]: I1124 09:20:32.962101 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"543b029c-5742-4c86-87c5-2c0f6dee9431","Type":"ContainerDied","Data":"52d586be505a6d6205c69f838d8b9a0f5b9a32a4e239ff61aa62d9482e6447b8"} Nov 24 09:20:32 crc kubenswrapper[4563]: I1124 09:20:32.962161 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"543b029c-5742-4c86-87c5-2c0f6dee9431","Type":"ContainerDied","Data":"7ead6355a31689e297d874dc6d568c63c15decd9e5bef9bce8c25974cb0be577"} Nov 24 09:20:32 crc kubenswrapper[4563]: I1124 09:20:32.962185 4563 scope.go:117] "RemoveContainer" containerID="52d586be505a6d6205c69f838d8b9a0f5b9a32a4e239ff61aa62d9482e6447b8" Nov 24 09:20:32 crc kubenswrapper[4563]: I1124 09:20:32.963142 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/543b029c-5742-4c86-87c5-2c0f6dee9431-config-data" (OuterVolumeSpecName: "config-data") pod "543b029c-5742-4c86-87c5-2c0f6dee9431" (UID: "543b029c-5742-4c86-87c5-2c0f6dee9431"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:32 crc kubenswrapper[4563]: I1124 09:20:32.969197 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/543b029c-5742-4c86-87c5-2c0f6dee9431-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "543b029c-5742-4c86-87c5-2c0f6dee9431" (UID: "543b029c-5742-4c86-87c5-2c0f6dee9431"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.026861 4563 scope.go:117] "RemoveContainer" containerID="52d586be505a6d6205c69f838d8b9a0f5b9a32a4e239ff61aa62d9482e6447b8" Nov 24 09:20:33 crc kubenswrapper[4563]: E1124 09:20:33.027492 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52d586be505a6d6205c69f838d8b9a0f5b9a32a4e239ff61aa62d9482e6447b8\": container with ID starting with 52d586be505a6d6205c69f838d8b9a0f5b9a32a4e239ff61aa62d9482e6447b8 not found: ID does not exist" containerID="52d586be505a6d6205c69f838d8b9a0f5b9a32a4e239ff61aa62d9482e6447b8" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.027542 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52d586be505a6d6205c69f838d8b9a0f5b9a32a4e239ff61aa62d9482e6447b8"} err="failed to get container status \"52d586be505a6d6205c69f838d8b9a0f5b9a32a4e239ff61aa62d9482e6447b8\": rpc error: code = NotFound desc = could not find container \"52d586be505a6d6205c69f838d8b9a0f5b9a32a4e239ff61aa62d9482e6447b8\": container with ID starting with 52d586be505a6d6205c69f838d8b9a0f5b9a32a4e239ff61aa62d9482e6447b8 not found: ID does not exist" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.037938 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/543b029c-5742-4c86-87c5-2c0f6dee9431-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.037970 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vxlj\" (UniqueName: \"kubernetes.io/projected/543b029c-5742-4c86-87c5-2c0f6dee9431-kube-api-access-9vxlj\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.037985 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/543b029c-5742-4c86-87c5-2c0f6dee9431-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.296764 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.310235 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.331549 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 09:20:33 crc kubenswrapper[4563]: E1124 09:20:33.332124 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="543b029c-5742-4c86-87c5-2c0f6dee9431" containerName="nova-scheduler-scheduler" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.337559 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="543b029c-5742-4c86-87c5-2c0f6dee9431" containerName="nova-scheduler-scheduler" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.338053 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="543b029c-5742-4c86-87c5-2c0f6dee9431" containerName="nova-scheduler-scheduler" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.338939 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.341806 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.348303 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.361034 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvm8r\" (UniqueName: \"kubernetes.io/projected/5021efe4-f1d2-4762-a196-c2f9b4266ba9-kube-api-access-gvm8r\") pod \"nova-scheduler-0\" (UID: \"5021efe4-f1d2-4762-a196-c2f9b4266ba9\") " pod="openstack/nova-scheduler-0" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.361426 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5021efe4-f1d2-4762-a196-c2f9b4266ba9-config-data\") pod \"nova-scheduler-0\" (UID: \"5021efe4-f1d2-4762-a196-c2f9b4266ba9\") " pod="openstack/nova-scheduler-0" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.361475 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5021efe4-f1d2-4762-a196-c2f9b4266ba9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5021efe4-f1d2-4762-a196-c2f9b4266ba9\") " pod="openstack/nova-scheduler-0" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.466284 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvm8r\" (UniqueName: \"kubernetes.io/projected/5021efe4-f1d2-4762-a196-c2f9b4266ba9-kube-api-access-gvm8r\") pod \"nova-scheduler-0\" (UID: \"5021efe4-f1d2-4762-a196-c2f9b4266ba9\") " pod="openstack/nova-scheduler-0" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.466487 4563 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5021efe4-f1d2-4762-a196-c2f9b4266ba9-config-data\") pod \"nova-scheduler-0\" (UID: \"5021efe4-f1d2-4762-a196-c2f9b4266ba9\") " pod="openstack/nova-scheduler-0" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.466549 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5021efe4-f1d2-4762-a196-c2f9b4266ba9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5021efe4-f1d2-4762-a196-c2f9b4266ba9\") " pod="openstack/nova-scheduler-0" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.474085 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5021efe4-f1d2-4762-a196-c2f9b4266ba9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5021efe4-f1d2-4762-a196-c2f9b4266ba9\") " pod="openstack/nova-scheduler-0" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.474100 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5021efe4-f1d2-4762-a196-c2f9b4266ba9-config-data\") pod \"nova-scheduler-0\" (UID: \"5021efe4-f1d2-4762-a196-c2f9b4266ba9\") " pod="openstack/nova-scheduler-0" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.482943 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvm8r\" (UniqueName: \"kubernetes.io/projected/5021efe4-f1d2-4762-a196-c2f9b4266ba9-kube-api-access-gvm8r\") pod \"nova-scheduler-0\" (UID: \"5021efe4-f1d2-4762-a196-c2f9b4266ba9\") " pod="openstack/nova-scheduler-0" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.663830 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.835185 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.879348 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/156cd303-99b2-4b2b-b149-1529e89a98ed-logs\") pod \"156cd303-99b2-4b2b-b149-1529e89a98ed\" (UID: \"156cd303-99b2-4b2b-b149-1529e89a98ed\") " Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.879407 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/156cd303-99b2-4b2b-b149-1529e89a98ed-config-data\") pod \"156cd303-99b2-4b2b-b149-1529e89a98ed\" (UID: \"156cd303-99b2-4b2b-b149-1529e89a98ed\") " Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.879483 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2f7j\" (UniqueName: \"kubernetes.io/projected/156cd303-99b2-4b2b-b149-1529e89a98ed-kube-api-access-f2f7j\") pod \"156cd303-99b2-4b2b-b149-1529e89a98ed\" (UID: \"156cd303-99b2-4b2b-b149-1529e89a98ed\") " Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.879591 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156cd303-99b2-4b2b-b149-1529e89a98ed-combined-ca-bundle\") pod \"156cd303-99b2-4b2b-b149-1529e89a98ed\" (UID: \"156cd303-99b2-4b2b-b149-1529e89a98ed\") " Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.880443 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/156cd303-99b2-4b2b-b149-1529e89a98ed-logs" (OuterVolumeSpecName: "logs") pod "156cd303-99b2-4b2b-b149-1529e89a98ed" (UID: "156cd303-99b2-4b2b-b149-1529e89a98ed"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.884910 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/156cd303-99b2-4b2b-b149-1529e89a98ed-kube-api-access-f2f7j" (OuterVolumeSpecName: "kube-api-access-f2f7j") pod "156cd303-99b2-4b2b-b149-1529e89a98ed" (UID: "156cd303-99b2-4b2b-b149-1529e89a98ed"). InnerVolumeSpecName "kube-api-access-f2f7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.905502 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/156cd303-99b2-4b2b-b149-1529e89a98ed-config-data" (OuterVolumeSpecName: "config-data") pod "156cd303-99b2-4b2b-b149-1529e89a98ed" (UID: "156cd303-99b2-4b2b-b149-1529e89a98ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.907419 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/156cd303-99b2-4b2b-b149-1529e89a98ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "156cd303-99b2-4b2b-b149-1529e89a98ed" (UID: "156cd303-99b2-4b2b-b149-1529e89a98ed"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.974128 4563 generic.go:334] "Generic (PLEG): container finished" podID="156cd303-99b2-4b2b-b149-1529e89a98ed" containerID="96237b7b48247229d5b773a81bc65af32178722ae26cb0431156e33c7eaebb09" exitCode=0 Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.974205 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"156cd303-99b2-4b2b-b149-1529e89a98ed","Type":"ContainerDied","Data":"96237b7b48247229d5b773a81bc65af32178722ae26cb0431156e33c7eaebb09"} Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.974257 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.974309 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"156cd303-99b2-4b2b-b149-1529e89a98ed","Type":"ContainerDied","Data":"61ff65de1298b1169980a5350f69d4c503735f0e34a2a510d1d2dd8e76a9bf3b"} Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.974338 4563 scope.go:117] "RemoveContainer" containerID="96237b7b48247229d5b773a81bc65af32178722ae26cb0431156e33c7eaebb09" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.981867 4563 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/156cd303-99b2-4b2b-b149-1529e89a98ed-logs\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.981900 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/156cd303-99b2-4b2b-b149-1529e89a98ed-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.981914 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2f7j\" (UniqueName: 
\"kubernetes.io/projected/156cd303-99b2-4b2b-b149-1529e89a98ed-kube-api-access-f2f7j\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:33 crc kubenswrapper[4563]: I1124 09:20:33.981929 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156cd303-99b2-4b2b-b149-1529e89a98ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.006336 4563 scope.go:117] "RemoveContainer" containerID="3bbe161e57a24dce5944830853213c9ee9c5a7ec434a36e7ca752161bec6a1e4" Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.009869 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.024349 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.034172 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 24 09:20:34 crc kubenswrapper[4563]: E1124 09:20:34.034633 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="156cd303-99b2-4b2b-b149-1529e89a98ed" containerName="nova-api-log" Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.034662 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="156cd303-99b2-4b2b-b149-1529e89a98ed" containerName="nova-api-log" Nov 24 09:20:34 crc kubenswrapper[4563]: E1124 09:20:34.034701 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="156cd303-99b2-4b2b-b149-1529e89a98ed" containerName="nova-api-api" Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.034707 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="156cd303-99b2-4b2b-b149-1529e89a98ed" containerName="nova-api-api" Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.034879 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="156cd303-99b2-4b2b-b149-1529e89a98ed" containerName="nova-api-api" Nov 24 
09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.034896 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="156cd303-99b2-4b2b-b149-1529e89a98ed" containerName="nova-api-log" Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.035956 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.037489 4563 scope.go:117] "RemoveContainer" containerID="96237b7b48247229d5b773a81bc65af32178722ae26cb0431156e33c7eaebb09" Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.037711 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 24 09:20:34 crc kubenswrapper[4563]: E1124 09:20:34.038169 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96237b7b48247229d5b773a81bc65af32178722ae26cb0431156e33c7eaebb09\": container with ID starting with 96237b7b48247229d5b773a81bc65af32178722ae26cb0431156e33c7eaebb09 not found: ID does not exist" containerID="96237b7b48247229d5b773a81bc65af32178722ae26cb0431156e33c7eaebb09" Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.038208 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96237b7b48247229d5b773a81bc65af32178722ae26cb0431156e33c7eaebb09"} err="failed to get container status \"96237b7b48247229d5b773a81bc65af32178722ae26cb0431156e33c7eaebb09\": rpc error: code = NotFound desc = could not find container \"96237b7b48247229d5b773a81bc65af32178722ae26cb0431156e33c7eaebb09\": container with ID starting with 96237b7b48247229d5b773a81bc65af32178722ae26cb0431156e33c7eaebb09 not found: ID does not exist" Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.038280 4563 scope.go:117] "RemoveContainer" containerID="3bbe161e57a24dce5944830853213c9ee9c5a7ec434a36e7ca752161bec6a1e4" Nov 24 09:20:34 crc kubenswrapper[4563]: E1124 
09:20:34.039071 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bbe161e57a24dce5944830853213c9ee9c5a7ec434a36e7ca752161bec6a1e4\": container with ID starting with 3bbe161e57a24dce5944830853213c9ee9c5a7ec434a36e7ca752161bec6a1e4 not found: ID does not exist" containerID="3bbe161e57a24dce5944830853213c9ee9c5a7ec434a36e7ca752161bec6a1e4" Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.039098 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bbe161e57a24dce5944830853213c9ee9c5a7ec434a36e7ca752161bec6a1e4"} err="failed to get container status \"3bbe161e57a24dce5944830853213c9ee9c5a7ec434a36e7ca752161bec6a1e4\": rpc error: code = NotFound desc = could not find container \"3bbe161e57a24dce5944830853213c9ee9c5a7ec434a36e7ca752161bec6a1e4\": container with ID starting with 3bbe161e57a24dce5944830853213c9ee9c5a7ec434a36e7ca752161bec6a1e4 not found: ID does not exist" Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.045186 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.083629 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/961502a3-4e3a-49f0-a8d1-67e462c289e5-logs\") pod \"nova-api-0\" (UID: \"961502a3-4e3a-49f0-a8d1-67e462c289e5\") " pod="openstack/nova-api-0" Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.083709 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/961502a3-4e3a-49f0-a8d1-67e462c289e5-config-data\") pod \"nova-api-0\" (UID: \"961502a3-4e3a-49f0-a8d1-67e462c289e5\") " pod="openstack/nova-api-0" Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.083732 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsd8w\" (UniqueName: \"kubernetes.io/projected/961502a3-4e3a-49f0-a8d1-67e462c289e5-kube-api-access-lsd8w\") pod \"nova-api-0\" (UID: \"961502a3-4e3a-49f0-a8d1-67e462c289e5\") " pod="openstack/nova-api-0" Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.084108 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/961502a3-4e3a-49f0-a8d1-67e462c289e5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"961502a3-4e3a-49f0-a8d1-67e462c289e5\") " pod="openstack/nova-api-0" Nov 24 09:20:34 crc kubenswrapper[4563]: W1124 09:20:34.084757 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5021efe4_f1d2_4762_a196_c2f9b4266ba9.slice/crio-7bc4a36f059098286445a01e2c4698c1567df2c7826f3c204cb759a90fe7a609 WatchSource:0}: Error finding container 7bc4a36f059098286445a01e2c4698c1567df2c7826f3c204cb759a90fe7a609: Status 404 returned error can't find the container with id 7bc4a36f059098286445a01e2c4698c1567df2c7826f3c204cb759a90fe7a609 Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.086511 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.186059 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/961502a3-4e3a-49f0-a8d1-67e462c289e5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"961502a3-4e3a-49f0-a8d1-67e462c289e5\") " pod="openstack/nova-api-0" Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.186150 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/961502a3-4e3a-49f0-a8d1-67e462c289e5-logs\") pod \"nova-api-0\" (UID: 
\"961502a3-4e3a-49f0-a8d1-67e462c289e5\") " pod="openstack/nova-api-0" Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.186179 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/961502a3-4e3a-49f0-a8d1-67e462c289e5-config-data\") pod \"nova-api-0\" (UID: \"961502a3-4e3a-49f0-a8d1-67e462c289e5\") " pod="openstack/nova-api-0" Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.186197 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsd8w\" (UniqueName: \"kubernetes.io/projected/961502a3-4e3a-49f0-a8d1-67e462c289e5-kube-api-access-lsd8w\") pod \"nova-api-0\" (UID: \"961502a3-4e3a-49f0-a8d1-67e462c289e5\") " pod="openstack/nova-api-0" Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.186687 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/961502a3-4e3a-49f0-a8d1-67e462c289e5-logs\") pod \"nova-api-0\" (UID: \"961502a3-4e3a-49f0-a8d1-67e462c289e5\") " pod="openstack/nova-api-0" Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.190553 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/961502a3-4e3a-49f0-a8d1-67e462c289e5-config-data\") pod \"nova-api-0\" (UID: \"961502a3-4e3a-49f0-a8d1-67e462c289e5\") " pod="openstack/nova-api-0" Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.190915 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/961502a3-4e3a-49f0-a8d1-67e462c289e5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"961502a3-4e3a-49f0-a8d1-67e462c289e5\") " pod="openstack/nova-api-0" Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.201275 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsd8w\" (UniqueName: 
\"kubernetes.io/projected/961502a3-4e3a-49f0-a8d1-67e462c289e5-kube-api-access-lsd8w\") pod \"nova-api-0\" (UID: \"961502a3-4e3a-49f0-a8d1-67e462c289e5\") " pod="openstack/nova-api-0" Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.312157 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.312515 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.353374 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.761014 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 24 09:20:34 crc kubenswrapper[4563]: W1124 09:20:34.772442 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod961502a3_4e3a_49f0_a8d1_67e462c289e5.slice/crio-57cdba65fe7e06a3047ec732ab2bf7d2b752c184d0c554dbb4c2a0396e6c15f3 WatchSource:0}: Error finding container 57cdba65fe7e06a3047ec732ab2bf7d2b752c184d0c554dbb4c2a0396e6c15f3: Status 404 returned error can't find the container with id 57cdba65fe7e06a3047ec732ab2bf7d2b752c184d0c554dbb4c2a0396e6c15f3 Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.995912 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5021efe4-f1d2-4762-a196-c2f9b4266ba9","Type":"ContainerStarted","Data":"43fe1e4be27621f18430e2fd2123c205a977749861ba4d745645d70c938bb9c7"} Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.995959 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5021efe4-f1d2-4762-a196-c2f9b4266ba9","Type":"ContainerStarted","Data":"7bc4a36f059098286445a01e2c4698c1567df2c7826f3c204cb759a90fe7a609"} Nov 24 09:20:34 crc 
kubenswrapper[4563]: I1124 09:20:34.999744 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"961502a3-4e3a-49f0-a8d1-67e462c289e5","Type":"ContainerStarted","Data":"deb1e86a08f2f8b30c21ff687cc39cdf3150e66713f4ac65fa23d2e7201552ed"} Nov 24 09:20:34 crc kubenswrapper[4563]: I1124 09:20:34.999769 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"961502a3-4e3a-49f0-a8d1-67e462c289e5","Type":"ContainerStarted","Data":"57cdba65fe7e06a3047ec732ab2bf7d2b752c184d0c554dbb4c2a0396e6c15f3"} Nov 24 09:20:35 crc kubenswrapper[4563]: I1124 09:20:35.019451 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.019434708 podStartE2EDuration="2.019434708s" podCreationTimestamp="2025-11-24 09:20:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:20:35.010692481 +0000 UTC m=+1012.269669927" watchObservedRunningTime="2025-11-24 09:20:35.019434708 +0000 UTC m=+1012.278412155" Nov 24 09:20:35 crc kubenswrapper[4563]: I1124 09:20:35.065508 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="156cd303-99b2-4b2b-b149-1529e89a98ed" path="/var/lib/kubelet/pods/156cd303-99b2-4b2b-b149-1529e89a98ed/volumes" Nov 24 09:20:35 crc kubenswrapper[4563]: I1124 09:20:35.066293 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="543b029c-5742-4c86-87c5-2c0f6dee9431" path="/var/lib/kubelet/pods/543b029c-5742-4c86-87c5-2c0f6dee9431/volumes" Nov 24 09:20:36 crc kubenswrapper[4563]: I1124 09:20:36.020864 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"961502a3-4e3a-49f0-a8d1-67e462c289e5","Type":"ContainerStarted","Data":"3cbbf00b512a0046ad8a6f6e50bb7d228e29c81f5383c2f81b308c3a3d22e65b"} Nov 24 09:20:36 crc kubenswrapper[4563]: I1124 09:20:36.044498 4563 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.044478716 podStartE2EDuration="2.044478716s" podCreationTimestamp="2025-11-24 09:20:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:20:36.037493422 +0000 UTC m=+1013.296470869" watchObservedRunningTime="2025-11-24 09:20:36.044478716 +0000 UTC m=+1013.303456163" Nov 24 09:20:38 crc kubenswrapper[4563]: I1124 09:20:38.664014 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 24 09:20:38 crc kubenswrapper[4563]: I1124 09:20:38.987667 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:20:38 crc kubenswrapper[4563]: I1124 09:20:38.987760 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:20:39 crc kubenswrapper[4563]: I1124 09:20:39.312253 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 24 09:20:39 crc kubenswrapper[4563]: I1124 09:20:39.312578 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 24 09:20:40 crc kubenswrapper[4563]: I1124 09:20:40.013249 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 24 09:20:40 crc kubenswrapper[4563]: I1124 09:20:40.328791 4563 prober.go:107] "Probe 
failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0301f820-dbcb-4cad-a180-7aed80a46db6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.198:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 24 09:20:40 crc kubenswrapper[4563]: I1124 09:20:40.328919 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0301f820-dbcb-4cad-a180-7aed80a46db6" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.198:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 24 09:20:43 crc kubenswrapper[4563]: I1124 09:20:43.401233 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 24 09:20:43 crc kubenswrapper[4563]: I1124 09:20:43.402117 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="8d28277e-c9e2-4e14-bcda-b8e7684ce6f2" containerName="kube-state-metrics" containerID="cri-o://ee4389b6221b7cf681e00888ee60280859a24a1c697b600bfd3959daf6d7fb54" gracePeriod=30 Nov 24 09:20:43 crc kubenswrapper[4563]: I1124 09:20:43.671719 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 24 09:20:43 crc kubenswrapper[4563]: I1124 09:20:43.704322 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 24 09:20:43 crc kubenswrapper[4563]: I1124 09:20:43.831143 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 24 09:20:43 crc kubenswrapper[4563]: I1124 09:20:43.988915 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whrk7\" (UniqueName: \"kubernetes.io/projected/8d28277e-c9e2-4e14-bcda-b8e7684ce6f2-kube-api-access-whrk7\") pod \"8d28277e-c9e2-4e14-bcda-b8e7684ce6f2\" (UID: \"8d28277e-c9e2-4e14-bcda-b8e7684ce6f2\") " Nov 24 09:20:43 crc kubenswrapper[4563]: I1124 09:20:43.996162 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d28277e-c9e2-4e14-bcda-b8e7684ce6f2-kube-api-access-whrk7" (OuterVolumeSpecName: "kube-api-access-whrk7") pod "8d28277e-c9e2-4e14-bcda-b8e7684ce6f2" (UID: "8d28277e-c9e2-4e14-bcda-b8e7684ce6f2"). InnerVolumeSpecName "kube-api-access-whrk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.092402 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whrk7\" (UniqueName: \"kubernetes.io/projected/8d28277e-c9e2-4e14-bcda-b8e7684ce6f2-kube-api-access-whrk7\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.103692 4563 generic.go:334] "Generic (PLEG): container finished" podID="8d28277e-c9e2-4e14-bcda-b8e7684ce6f2" containerID="ee4389b6221b7cf681e00888ee60280859a24a1c697b600bfd3959daf6d7fb54" exitCode=2 Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.104787 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.112229 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8d28277e-c9e2-4e14-bcda-b8e7684ce6f2","Type":"ContainerDied","Data":"ee4389b6221b7cf681e00888ee60280859a24a1c697b600bfd3959daf6d7fb54"} Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.112265 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8d28277e-c9e2-4e14-bcda-b8e7684ce6f2","Type":"ContainerDied","Data":"556f1a86d4d9bd6d35e66635f8b463fcd483675a1fb633a05684859f95200bbd"} Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.112285 4563 scope.go:117] "RemoveContainer" containerID="ee4389b6221b7cf681e00888ee60280859a24a1c697b600bfd3959daf6d7fb54" Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.136142 4563 scope.go:117] "RemoveContainer" containerID="ee4389b6221b7cf681e00888ee60280859a24a1c697b600bfd3959daf6d7fb54" Nov 24 09:20:44 crc kubenswrapper[4563]: E1124 09:20:44.140091 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee4389b6221b7cf681e00888ee60280859a24a1c697b600bfd3959daf6d7fb54\": container with ID starting with ee4389b6221b7cf681e00888ee60280859a24a1c697b600bfd3959daf6d7fb54 not found: ID does not exist" containerID="ee4389b6221b7cf681e00888ee60280859a24a1c697b600bfd3959daf6d7fb54" Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.140213 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee4389b6221b7cf681e00888ee60280859a24a1c697b600bfd3959daf6d7fb54"} err="failed to get container status \"ee4389b6221b7cf681e00888ee60280859a24a1c697b600bfd3959daf6d7fb54\": rpc error: code = NotFound desc = could not find container \"ee4389b6221b7cf681e00888ee60280859a24a1c697b600bfd3959daf6d7fb54\": container with ID starting with 
ee4389b6221b7cf681e00888ee60280859a24a1c697b600bfd3959daf6d7fb54 not found: ID does not exist" Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.150040 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.152011 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.162395 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.170623 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 24 09:20:44 crc kubenswrapper[4563]: E1124 09:20:44.171303 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d28277e-c9e2-4e14-bcda-b8e7684ce6f2" containerName="kube-state-metrics" Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.171324 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d28277e-c9e2-4e14-bcda-b8e7684ce6f2" containerName="kube-state-metrics" Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.171589 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d28277e-c9e2-4e14-bcda-b8e7684ce6f2" containerName="kube-state-metrics" Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.172443 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.175527 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.175527 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.180595 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.297769 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xxg9\" (UniqueName: \"kubernetes.io/projected/86c40cc3-1c2a-47db-9ed2-eb746b65ac4b-kube-api-access-4xxg9\") pod \"kube-state-metrics-0\" (UID: \"86c40cc3-1c2a-47db-9ed2-eb746b65ac4b\") " pod="openstack/kube-state-metrics-0" Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.297990 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/86c40cc3-1c2a-47db-9ed2-eb746b65ac4b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"86c40cc3-1c2a-47db-9ed2-eb746b65ac4b\") " pod="openstack/kube-state-metrics-0" Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.298032 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c40cc3-1c2a-47db-9ed2-eb746b65ac4b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"86c40cc3-1c2a-47db-9ed2-eb746b65ac4b\") " pod="openstack/kube-state-metrics-0" Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.298077 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/86c40cc3-1c2a-47db-9ed2-eb746b65ac4b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"86c40cc3-1c2a-47db-9ed2-eb746b65ac4b\") " pod="openstack/kube-state-metrics-0" Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.354150 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.354201 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.400427 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xxg9\" (UniqueName: \"kubernetes.io/projected/86c40cc3-1c2a-47db-9ed2-eb746b65ac4b-kube-api-access-4xxg9\") pod \"kube-state-metrics-0\" (UID: \"86c40cc3-1c2a-47db-9ed2-eb746b65ac4b\") " pod="openstack/kube-state-metrics-0" Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.400674 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/86c40cc3-1c2a-47db-9ed2-eb746b65ac4b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"86c40cc3-1c2a-47db-9ed2-eb746b65ac4b\") " pod="openstack/kube-state-metrics-0" Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.400740 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c40cc3-1c2a-47db-9ed2-eb746b65ac4b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"86c40cc3-1c2a-47db-9ed2-eb746b65ac4b\") " pod="openstack/kube-state-metrics-0" Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.400773 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/86c40cc3-1c2a-47db-9ed2-eb746b65ac4b-kube-state-metrics-tls-certs\") pod 
\"kube-state-metrics-0\" (UID: \"86c40cc3-1c2a-47db-9ed2-eb746b65ac4b\") " pod="openstack/kube-state-metrics-0" Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.406428 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/86c40cc3-1c2a-47db-9ed2-eb746b65ac4b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"86c40cc3-1c2a-47db-9ed2-eb746b65ac4b\") " pod="openstack/kube-state-metrics-0" Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.413930 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c40cc3-1c2a-47db-9ed2-eb746b65ac4b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"86c40cc3-1c2a-47db-9ed2-eb746b65ac4b\") " pod="openstack/kube-state-metrics-0" Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.414336 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/86c40cc3-1c2a-47db-9ed2-eb746b65ac4b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"86c40cc3-1c2a-47db-9ed2-eb746b65ac4b\") " pod="openstack/kube-state-metrics-0" Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.414965 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xxg9\" (UniqueName: \"kubernetes.io/projected/86c40cc3-1c2a-47db-9ed2-eb746b65ac4b-kube-api-access-4xxg9\") pod \"kube-state-metrics-0\" (UID: \"86c40cc3-1c2a-47db-9ed2-eb746b65ac4b\") " pod="openstack/kube-state-metrics-0" Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.494784 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 24 09:20:44 crc kubenswrapper[4563]: I1124 09:20:44.933036 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 24 09:20:45 crc kubenswrapper[4563]: I1124 09:20:45.000176 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:20:45 crc kubenswrapper[4563]: I1124 09:20:45.000549 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b" containerName="ceilometer-central-agent" containerID="cri-o://8a3f77239c4245694b9ea33de4825b5afd4bc159721329407c3278485cad7808" gracePeriod=30 Nov 24 09:20:45 crc kubenswrapper[4563]: I1124 09:20:45.000763 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b" containerName="proxy-httpd" containerID="cri-o://6cc9511aa58c5a49ef8fa255825bc26a1ae59d08f70be4112f8547340ced9d32" gracePeriod=30 Nov 24 09:20:45 crc kubenswrapper[4563]: I1124 09:20:45.000821 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b" containerName="sg-core" containerID="cri-o://6619eae2703b15e362e62ea50db8761c10b2ff961756f1432e95fd86594a5c87" gracePeriod=30 Nov 24 09:20:45 crc kubenswrapper[4563]: I1124 09:20:45.000863 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b" containerName="ceilometer-notification-agent" containerID="cri-o://11abab118e1b19bf81567d54eca3d64b6079a6410d25f3727f652e05ddd2420a" gracePeriod=30 Nov 24 09:20:45 crc kubenswrapper[4563]: I1124 09:20:45.067008 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d28277e-c9e2-4e14-bcda-b8e7684ce6f2" 
path="/var/lib/kubelet/pods/8d28277e-c9e2-4e14-bcda-b8e7684ce6f2/volumes" Nov 24 09:20:45 crc kubenswrapper[4563]: I1124 09:20:45.118477 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"86c40cc3-1c2a-47db-9ed2-eb746b65ac4b","Type":"ContainerStarted","Data":"3ea263dbb241cc3cc90069dd402d327ee3066735fe4899902800cddc4595007d"} Nov 24 09:20:45 crc kubenswrapper[4563]: I1124 09:20:45.436812 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="961502a3-4e3a-49f0-a8d1-67e462c289e5" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 24 09:20:45 crc kubenswrapper[4563]: I1124 09:20:45.436813 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="961502a3-4e3a-49f0-a8d1-67e462c289e5" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 24 09:20:46 crc kubenswrapper[4563]: I1124 09:20:46.128535 4563 generic.go:334] "Generic (PLEG): container finished" podID="ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b" containerID="6cc9511aa58c5a49ef8fa255825bc26a1ae59d08f70be4112f8547340ced9d32" exitCode=0 Nov 24 09:20:46 crc kubenswrapper[4563]: I1124 09:20:46.128863 4563 generic.go:334] "Generic (PLEG): container finished" podID="ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b" containerID="6619eae2703b15e362e62ea50db8761c10b2ff961756f1432e95fd86594a5c87" exitCode=2 Nov 24 09:20:46 crc kubenswrapper[4563]: I1124 09:20:46.128874 4563 generic.go:334] "Generic (PLEG): container finished" podID="ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b" containerID="8a3f77239c4245694b9ea33de4825b5afd4bc159721329407c3278485cad7808" exitCode=0 Nov 24 09:20:46 crc kubenswrapper[4563]: I1124 09:20:46.128745 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b","Type":"ContainerDied","Data":"6cc9511aa58c5a49ef8fa255825bc26a1ae59d08f70be4112f8547340ced9d32"} Nov 24 09:20:46 crc kubenswrapper[4563]: I1124 09:20:46.128943 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b","Type":"ContainerDied","Data":"6619eae2703b15e362e62ea50db8761c10b2ff961756f1432e95fd86594a5c87"} Nov 24 09:20:46 crc kubenswrapper[4563]: I1124 09:20:46.128959 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b","Type":"ContainerDied","Data":"8a3f77239c4245694b9ea33de4825b5afd4bc159721329407c3278485cad7808"} Nov 24 09:20:46 crc kubenswrapper[4563]: I1124 09:20:46.130140 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"86c40cc3-1c2a-47db-9ed2-eb746b65ac4b","Type":"ContainerStarted","Data":"d37d13b4cad3fe53059efd65f80b915801e4d7d1a49f8ae4e7fac8cd50fd6384"} Nov 24 09:20:46 crc kubenswrapper[4563]: I1124 09:20:46.131461 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 24 09:20:46 crc kubenswrapper[4563]: I1124 09:20:46.154197 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.886703695 podStartE2EDuration="2.154181259s" podCreationTimestamp="2025-11-24 09:20:44 +0000 UTC" firstStartedPulling="2025-11-24 09:20:44.936267369 +0000 UTC m=+1022.195244816" lastFinishedPulling="2025-11-24 09:20:45.203744933 +0000 UTC m=+1022.462722380" observedRunningTime="2025-11-24 09:20:46.143969681 +0000 UTC m=+1023.402947128" watchObservedRunningTime="2025-11-24 09:20:46.154181259 +0000 UTC m=+1023.413158707" Nov 24 09:20:47 crc kubenswrapper[4563]: I1124 09:20:47.770709 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 09:20:47 crc kubenswrapper[4563]: I1124 09:20:47.880709 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-scripts\") pod \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\" (UID: \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\") " Nov 24 09:20:47 crc kubenswrapper[4563]: I1124 09:20:47.880759 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-config-data\") pod \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\" (UID: \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\") " Nov 24 09:20:47 crc kubenswrapper[4563]: I1124 09:20:47.880948 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-combined-ca-bundle\") pod \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\" (UID: \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\") " Nov 24 09:20:47 crc kubenswrapper[4563]: I1124 09:20:47.881209 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-log-httpd\") pod \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\" (UID: \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\") " Nov 24 09:20:47 crc kubenswrapper[4563]: I1124 09:20:47.881238 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-sg-core-conf-yaml\") pod \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\" (UID: \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\") " Nov 24 09:20:47 crc kubenswrapper[4563]: I1124 09:20:47.881311 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-run-httpd\") pod \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\" (UID: \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\") " Nov 24 09:20:47 crc kubenswrapper[4563]: I1124 09:20:47.881400 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d67wf\" (UniqueName: \"kubernetes.io/projected/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-kube-api-access-d67wf\") pod \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\" (UID: \"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b\") " Nov 24 09:20:47 crc kubenswrapper[4563]: I1124 09:20:47.881626 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b" (UID: "ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:20:47 crc kubenswrapper[4563]: I1124 09:20:47.881739 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b" (UID: "ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:20:47 crc kubenswrapper[4563]: I1124 09:20:47.881922 4563 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:47 crc kubenswrapper[4563]: I1124 09:20:47.881937 4563 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:47 crc kubenswrapper[4563]: I1124 09:20:47.888848 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-scripts" (OuterVolumeSpecName: "scripts") pod "ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b" (UID: "ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:47 crc kubenswrapper[4563]: I1124 09:20:47.892761 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-kube-api-access-d67wf" (OuterVolumeSpecName: "kube-api-access-d67wf") pod "ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b" (UID: "ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b"). InnerVolumeSpecName "kube-api-access-d67wf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:20:47 crc kubenswrapper[4563]: I1124 09:20:47.908128 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b" (UID: "ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:47 crc kubenswrapper[4563]: I1124 09:20:47.957005 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b" (UID: "ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:47 crc kubenswrapper[4563]: I1124 09:20:47.971115 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-config-data" (OuterVolumeSpecName: "config-data") pod "ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b" (UID: "ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:47 crc kubenswrapper[4563]: I1124 09:20:47.983446 4563 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:47 crc kubenswrapper[4563]: I1124 09:20:47.983482 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d67wf\" (UniqueName: \"kubernetes.io/projected/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-kube-api-access-d67wf\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:47 crc kubenswrapper[4563]: I1124 09:20:47.983497 4563 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:47 crc kubenswrapper[4563]: I1124 09:20:47.983510 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 
09:20:47 crc kubenswrapper[4563]: I1124 09:20:47.983520 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.176657 4563 generic.go:334] "Generic (PLEG): container finished" podID="ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b" containerID="11abab118e1b19bf81567d54eca3d64b6079a6410d25f3727f652e05ddd2420a" exitCode=0 Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.176701 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b","Type":"ContainerDied","Data":"11abab118e1b19bf81567d54eca3d64b6079a6410d25f3727f652e05ddd2420a"} Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.176759 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b","Type":"ContainerDied","Data":"1d133de685ad21e08c3435cbd0182b25eefe728d9286705dd31ebc5e0088fc34"} Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.176783 4563 scope.go:117] "RemoveContainer" containerID="6cc9511aa58c5a49ef8fa255825bc26a1ae59d08f70be4112f8547340ced9d32" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.176715 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.204532 4563 scope.go:117] "RemoveContainer" containerID="6619eae2703b15e362e62ea50db8761c10b2ff961756f1432e95fd86594a5c87" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.213744 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.222714 4563 scope.go:117] "RemoveContainer" containerID="11abab118e1b19bf81567d54eca3d64b6079a6410d25f3727f652e05ddd2420a" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.225743 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.231018 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:20:48 crc kubenswrapper[4563]: E1124 09:20:48.231621 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b" containerName="ceilometer-notification-agent" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.231654 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b" containerName="ceilometer-notification-agent" Nov 24 09:20:48 crc kubenswrapper[4563]: E1124 09:20:48.231669 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b" containerName="ceilometer-central-agent" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.231676 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b" containerName="ceilometer-central-agent" Nov 24 09:20:48 crc kubenswrapper[4563]: E1124 09:20:48.231697 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b" containerName="sg-core" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.231703 4563 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b" containerName="sg-core" Nov 24 09:20:48 crc kubenswrapper[4563]: E1124 09:20:48.231715 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b" containerName="proxy-httpd" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.231721 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b" containerName="proxy-httpd" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.232008 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b" containerName="ceilometer-notification-agent" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.232030 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b" containerName="proxy-httpd" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.232043 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b" containerName="sg-core" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.232051 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b" containerName="ceilometer-central-agent" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.233915 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.236284 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.236565 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.239964 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.240580 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.248313 4563 scope.go:117] "RemoveContainer" containerID="8a3f77239c4245694b9ea33de4825b5afd4bc159721329407c3278485cad7808" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.271727 4563 scope.go:117] "RemoveContainer" containerID="6cc9511aa58c5a49ef8fa255825bc26a1ae59d08f70be4112f8547340ced9d32" Nov 24 09:20:48 crc kubenswrapper[4563]: E1124 09:20:48.272205 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cc9511aa58c5a49ef8fa255825bc26a1ae59d08f70be4112f8547340ced9d32\": container with ID starting with 6cc9511aa58c5a49ef8fa255825bc26a1ae59d08f70be4112f8547340ced9d32 not found: ID does not exist" containerID="6cc9511aa58c5a49ef8fa255825bc26a1ae59d08f70be4112f8547340ced9d32" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.272293 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cc9511aa58c5a49ef8fa255825bc26a1ae59d08f70be4112f8547340ced9d32"} err="failed to get container status \"6cc9511aa58c5a49ef8fa255825bc26a1ae59d08f70be4112f8547340ced9d32\": rpc error: code = NotFound desc = could not find container \"6cc9511aa58c5a49ef8fa255825bc26a1ae59d08f70be4112f8547340ced9d32\": 
container with ID starting with 6cc9511aa58c5a49ef8fa255825bc26a1ae59d08f70be4112f8547340ced9d32 not found: ID does not exist" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.272381 4563 scope.go:117] "RemoveContainer" containerID="6619eae2703b15e362e62ea50db8761c10b2ff961756f1432e95fd86594a5c87" Nov 24 09:20:48 crc kubenswrapper[4563]: E1124 09:20:48.272860 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6619eae2703b15e362e62ea50db8761c10b2ff961756f1432e95fd86594a5c87\": container with ID starting with 6619eae2703b15e362e62ea50db8761c10b2ff961756f1432e95fd86594a5c87 not found: ID does not exist" containerID="6619eae2703b15e362e62ea50db8761c10b2ff961756f1432e95fd86594a5c87" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.272934 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6619eae2703b15e362e62ea50db8761c10b2ff961756f1432e95fd86594a5c87"} err="failed to get container status \"6619eae2703b15e362e62ea50db8761c10b2ff961756f1432e95fd86594a5c87\": rpc error: code = NotFound desc = could not find container \"6619eae2703b15e362e62ea50db8761c10b2ff961756f1432e95fd86594a5c87\": container with ID starting with 6619eae2703b15e362e62ea50db8761c10b2ff961756f1432e95fd86594a5c87 not found: ID does not exist" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.272995 4563 scope.go:117] "RemoveContainer" containerID="11abab118e1b19bf81567d54eca3d64b6079a6410d25f3727f652e05ddd2420a" Nov 24 09:20:48 crc kubenswrapper[4563]: E1124 09:20:48.273345 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11abab118e1b19bf81567d54eca3d64b6079a6410d25f3727f652e05ddd2420a\": container with ID starting with 11abab118e1b19bf81567d54eca3d64b6079a6410d25f3727f652e05ddd2420a not found: ID does not exist" 
containerID="11abab118e1b19bf81567d54eca3d64b6079a6410d25f3727f652e05ddd2420a" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.273434 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11abab118e1b19bf81567d54eca3d64b6079a6410d25f3727f652e05ddd2420a"} err="failed to get container status \"11abab118e1b19bf81567d54eca3d64b6079a6410d25f3727f652e05ddd2420a\": rpc error: code = NotFound desc = could not find container \"11abab118e1b19bf81567d54eca3d64b6079a6410d25f3727f652e05ddd2420a\": container with ID starting with 11abab118e1b19bf81567d54eca3d64b6079a6410d25f3727f652e05ddd2420a not found: ID does not exist" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.273512 4563 scope.go:117] "RemoveContainer" containerID="8a3f77239c4245694b9ea33de4825b5afd4bc159721329407c3278485cad7808" Nov 24 09:20:48 crc kubenswrapper[4563]: E1124 09:20:48.273971 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a3f77239c4245694b9ea33de4825b5afd4bc159721329407c3278485cad7808\": container with ID starting with 8a3f77239c4245694b9ea33de4825b5afd4bc159721329407c3278485cad7808 not found: ID does not exist" containerID="8a3f77239c4245694b9ea33de4825b5afd4bc159721329407c3278485cad7808" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.274029 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a3f77239c4245694b9ea33de4825b5afd4bc159721329407c3278485cad7808"} err="failed to get container status \"8a3f77239c4245694b9ea33de4825b5afd4bc159721329407c3278485cad7808\": rpc error: code = NotFound desc = could not find container \"8a3f77239c4245694b9ea33de4825b5afd4bc159721329407c3278485cad7808\": container with ID starting with 8a3f77239c4245694b9ea33de4825b5afd4bc159721329407c3278485cad7808 not found: ID does not exist" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.390311 4563 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0db1e776-c3a0-4046-bd24-45b7c409eac6-log-httpd\") pod \"ceilometer-0\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " pod="openstack/ceilometer-0" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.390407 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " pod="openstack/ceilometer-0" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.391384 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " pod="openstack/ceilometer-0" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.391568 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " pod="openstack/ceilometer-0" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.391670 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-scripts\") pod \"ceilometer-0\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " pod="openstack/ceilometer-0" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.391828 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/0db1e776-c3a0-4046-bd24-45b7c409eac6-run-httpd\") pod \"ceilometer-0\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " pod="openstack/ceilometer-0" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.391900 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8llsw\" (UniqueName: \"kubernetes.io/projected/0db1e776-c3a0-4046-bd24-45b7c409eac6-kube-api-access-8llsw\") pod \"ceilometer-0\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " pod="openstack/ceilometer-0" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.391940 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-config-data\") pod \"ceilometer-0\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " pod="openstack/ceilometer-0" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.493911 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0db1e776-c3a0-4046-bd24-45b7c409eac6-log-httpd\") pod \"ceilometer-0\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " pod="openstack/ceilometer-0" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.493992 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " pod="openstack/ceilometer-0" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.494046 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " 
pod="openstack/ceilometer-0" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.494116 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " pod="openstack/ceilometer-0" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.494162 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-scripts\") pod \"ceilometer-0\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " pod="openstack/ceilometer-0" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.494217 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0db1e776-c3a0-4046-bd24-45b7c409eac6-run-httpd\") pod \"ceilometer-0\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " pod="openstack/ceilometer-0" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.494259 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8llsw\" (UniqueName: \"kubernetes.io/projected/0db1e776-c3a0-4046-bd24-45b7c409eac6-kube-api-access-8llsw\") pod \"ceilometer-0\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " pod="openstack/ceilometer-0" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.494284 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-config-data\") pod \"ceilometer-0\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " pod="openstack/ceilometer-0" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.494696 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/0db1e776-c3a0-4046-bd24-45b7c409eac6-log-httpd\") pod \"ceilometer-0\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " pod="openstack/ceilometer-0" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.495005 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0db1e776-c3a0-4046-bd24-45b7c409eac6-run-httpd\") pod \"ceilometer-0\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " pod="openstack/ceilometer-0" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.499845 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " pod="openstack/ceilometer-0" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.499896 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-scripts\") pod \"ceilometer-0\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " pod="openstack/ceilometer-0" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.500041 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " pod="openstack/ceilometer-0" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.505393 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " pod="openstack/ceilometer-0" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.506133 4563 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-config-data\") pod \"ceilometer-0\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " pod="openstack/ceilometer-0" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.509698 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8llsw\" (UniqueName: \"kubernetes.io/projected/0db1e776-c3a0-4046-bd24-45b7c409eac6-kube-api-access-8llsw\") pod \"ceilometer-0\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " pod="openstack/ceilometer-0" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.558096 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 09:20:48 crc kubenswrapper[4563]: I1124 09:20:48.952805 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:20:48 crc kubenswrapper[4563]: W1124 09:20:48.955859 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0db1e776_c3a0_4046_bd24_45b7c409eac6.slice/crio-0f623c856568628c85fc14e89fedd203f2fc268384de0c5c6b61d78e6932dced WatchSource:0}: Error finding container 0f623c856568628c85fc14e89fedd203f2fc268384de0c5c6b61d78e6932dced: Status 404 returned error can't find the container with id 0f623c856568628c85fc14e89fedd203f2fc268384de0c5c6b61d78e6932dced Nov 24 09:20:49 crc kubenswrapper[4563]: I1124 09:20:49.063651 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b" path="/var/lib/kubelet/pods/ea1e73ba-dc1f-41fe-a8c4-91c8a5e2ea6b/volumes" Nov 24 09:20:49 crc kubenswrapper[4563]: I1124 09:20:49.188018 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"0db1e776-c3a0-4046-bd24-45b7c409eac6","Type":"ContainerStarted","Data":"0f623c856568628c85fc14e89fedd203f2fc268384de0c5c6b61d78e6932dced"} Nov 24 09:20:49 crc kubenswrapper[4563]: I1124 09:20:49.317674 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 24 09:20:49 crc kubenswrapper[4563]: I1124 09:20:49.319770 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 24 09:20:49 crc kubenswrapper[4563]: I1124 09:20:49.323243 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 24 09:20:50 crc kubenswrapper[4563]: I1124 09:20:50.198577 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0db1e776-c3a0-4046-bd24-45b7c409eac6","Type":"ContainerStarted","Data":"6c4cbf24f8c85a11cd542c9a08818322d0e2d6c5c0380b75889f02336f838efb"} Nov 24 09:20:50 crc kubenswrapper[4563]: I1124 09:20:50.206024 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 24 09:20:51 crc kubenswrapper[4563]: I1124 09:20:51.215919 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0db1e776-c3a0-4046-bd24-45b7c409eac6","Type":"ContainerStarted","Data":"907d0409e523cd69b2e93922950d57e921d003720ded4c9a70a84fe0276ff045"} Nov 24 09:20:51 crc kubenswrapper[4563]: I1124 09:20:51.216510 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0db1e776-c3a0-4046-bd24-45b7c409eac6","Type":"ContainerStarted","Data":"c3cb6d371d6b3566da9135ae07912acc06fe4a188b3ee1b6ee78ee1e8b1b66c1"} Nov 24 09:20:52 crc kubenswrapper[4563]: I1124 09:20:52.279248 4563 generic.go:334] "Generic (PLEG): container finished" podID="f84ace12-fa17-4fc7-8bf0-771e8273eb55" containerID="daf4829fa7ff43fc586194d9c583f00284566907b2991030a438f4e5c5c3ebaf" exitCode=137 Nov 24 09:20:52 
crc kubenswrapper[4563]: I1124 09:20:52.279323 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f84ace12-fa17-4fc7-8bf0-771e8273eb55","Type":"ContainerDied","Data":"daf4829fa7ff43fc586194d9c583f00284566907b2991030a438f4e5c5c3ebaf"} Nov 24 09:20:52 crc kubenswrapper[4563]: I1124 09:20:52.279528 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f84ace12-fa17-4fc7-8bf0-771e8273eb55","Type":"ContainerDied","Data":"3892549bbf939ad51b847b69ba38192a6655def6832c2b14ebfd24b405d929c7"} Nov 24 09:20:52 crc kubenswrapper[4563]: I1124 09:20:52.279544 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3892549bbf939ad51b847b69ba38192a6655def6832c2b14ebfd24b405d929c7" Nov 24 09:20:52 crc kubenswrapper[4563]: I1124 09:20:52.344665 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:52 crc kubenswrapper[4563]: I1124 09:20:52.490757 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f84ace12-fa17-4fc7-8bf0-771e8273eb55-config-data\") pod \"f84ace12-fa17-4fc7-8bf0-771e8273eb55\" (UID: \"f84ace12-fa17-4fc7-8bf0-771e8273eb55\") " Nov 24 09:20:52 crc kubenswrapper[4563]: I1124 09:20:52.490826 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5r8xg\" (UniqueName: \"kubernetes.io/projected/f84ace12-fa17-4fc7-8bf0-771e8273eb55-kube-api-access-5r8xg\") pod \"f84ace12-fa17-4fc7-8bf0-771e8273eb55\" (UID: \"f84ace12-fa17-4fc7-8bf0-771e8273eb55\") " Nov 24 09:20:52 crc kubenswrapper[4563]: I1124 09:20:52.491090 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f84ace12-fa17-4fc7-8bf0-771e8273eb55-combined-ca-bundle\") pod 
\"f84ace12-fa17-4fc7-8bf0-771e8273eb55\" (UID: \"f84ace12-fa17-4fc7-8bf0-771e8273eb55\") " Nov 24 09:20:52 crc kubenswrapper[4563]: I1124 09:20:52.498182 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f84ace12-fa17-4fc7-8bf0-771e8273eb55-kube-api-access-5r8xg" (OuterVolumeSpecName: "kube-api-access-5r8xg") pod "f84ace12-fa17-4fc7-8bf0-771e8273eb55" (UID: "f84ace12-fa17-4fc7-8bf0-771e8273eb55"). InnerVolumeSpecName "kube-api-access-5r8xg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:20:52 crc kubenswrapper[4563]: I1124 09:20:52.516136 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f84ace12-fa17-4fc7-8bf0-771e8273eb55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f84ace12-fa17-4fc7-8bf0-771e8273eb55" (UID: "f84ace12-fa17-4fc7-8bf0-771e8273eb55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:52 crc kubenswrapper[4563]: I1124 09:20:52.517441 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f84ace12-fa17-4fc7-8bf0-771e8273eb55-config-data" (OuterVolumeSpecName: "config-data") pod "f84ace12-fa17-4fc7-8bf0-771e8273eb55" (UID: "f84ace12-fa17-4fc7-8bf0-771e8273eb55"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:52 crc kubenswrapper[4563]: I1124 09:20:52.593657 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f84ace12-fa17-4fc7-8bf0-771e8273eb55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:52 crc kubenswrapper[4563]: I1124 09:20:52.593692 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f84ace12-fa17-4fc7-8bf0-771e8273eb55-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:52 crc kubenswrapper[4563]: I1124 09:20:52.593705 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5r8xg\" (UniqueName: \"kubernetes.io/projected/f84ace12-fa17-4fc7-8bf0-771e8273eb55-kube-api-access-5r8xg\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:53 crc kubenswrapper[4563]: I1124 09:20:53.289187 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:53 crc kubenswrapper[4563]: I1124 09:20:53.290580 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0db1e776-c3a0-4046-bd24-45b7c409eac6","Type":"ContainerStarted","Data":"49da0c412c7439eb62b15d892b4e35f7c450a82f7010eff1b212a811038a05ff"} Nov 24 09:20:53 crc kubenswrapper[4563]: I1124 09:20:53.290613 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 24 09:20:53 crc kubenswrapper[4563]: I1124 09:20:53.308033 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.038445379 podStartE2EDuration="5.308016625s" podCreationTimestamp="2025-11-24 09:20:48 +0000 UTC" firstStartedPulling="2025-11-24 09:20:48.959830046 +0000 UTC m=+1026.218807493" lastFinishedPulling="2025-11-24 09:20:52.229401291 +0000 UTC m=+1029.488378739" observedRunningTime="2025-11-24 
09:20:53.307280056 +0000 UTC m=+1030.566257503" watchObservedRunningTime="2025-11-24 09:20:53.308016625 +0000 UTC m=+1030.566994072" Nov 24 09:20:53 crc kubenswrapper[4563]: I1124 09:20:53.320631 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 24 09:20:53 crc kubenswrapper[4563]: I1124 09:20:53.331309 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 24 09:20:53 crc kubenswrapper[4563]: I1124 09:20:53.346083 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 24 09:20:53 crc kubenswrapper[4563]: E1124 09:20:53.346576 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f84ace12-fa17-4fc7-8bf0-771e8273eb55" containerName="nova-cell1-novncproxy-novncproxy" Nov 24 09:20:53 crc kubenswrapper[4563]: I1124 09:20:53.346597 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="f84ace12-fa17-4fc7-8bf0-771e8273eb55" containerName="nova-cell1-novncproxy-novncproxy" Nov 24 09:20:53 crc kubenswrapper[4563]: I1124 09:20:53.346873 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="f84ace12-fa17-4fc7-8bf0-771e8273eb55" containerName="nova-cell1-novncproxy-novncproxy" Nov 24 09:20:53 crc kubenswrapper[4563]: I1124 09:20:53.348182 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:53 crc kubenswrapper[4563]: I1124 09:20:53.351560 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 24 09:20:53 crc kubenswrapper[4563]: I1124 09:20:53.352073 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Nov 24 09:20:53 crc kubenswrapper[4563]: I1124 09:20:53.352304 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Nov 24 09:20:53 crc kubenswrapper[4563]: I1124 09:20:53.357059 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 24 09:20:53 crc kubenswrapper[4563]: I1124 09:20:53.512856 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15b68912-1886-4162-88c8-02a37d34c54a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"15b68912-1886-4162-88c8-02a37d34c54a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:53 crc kubenswrapper[4563]: I1124 09:20:53.513595 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15b68912-1886-4162-88c8-02a37d34c54a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"15b68912-1886-4162-88c8-02a37d34c54a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:53 crc kubenswrapper[4563]: I1124 09:20:53.513999 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/15b68912-1886-4162-88c8-02a37d34c54a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"15b68912-1886-4162-88c8-02a37d34c54a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:53 crc kubenswrapper[4563]: 
I1124 09:20:53.514122 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/15b68912-1886-4162-88c8-02a37d34c54a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"15b68912-1886-4162-88c8-02a37d34c54a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:53 crc kubenswrapper[4563]: I1124 09:20:53.514221 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k8j2\" (UniqueName: \"kubernetes.io/projected/15b68912-1886-4162-88c8-02a37d34c54a-kube-api-access-6k8j2\") pod \"nova-cell1-novncproxy-0\" (UID: \"15b68912-1886-4162-88c8-02a37d34c54a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:53 crc kubenswrapper[4563]: I1124 09:20:53.616933 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15b68912-1886-4162-88c8-02a37d34c54a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"15b68912-1886-4162-88c8-02a37d34c54a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:53 crc kubenswrapper[4563]: I1124 09:20:53.616990 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15b68912-1886-4162-88c8-02a37d34c54a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"15b68912-1886-4162-88c8-02a37d34c54a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:53 crc kubenswrapper[4563]: I1124 09:20:53.617092 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/15b68912-1886-4162-88c8-02a37d34c54a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"15b68912-1886-4162-88c8-02a37d34c54a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:53 crc kubenswrapper[4563]: I1124 09:20:53.617115 4563 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/15b68912-1886-4162-88c8-02a37d34c54a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"15b68912-1886-4162-88c8-02a37d34c54a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:53 crc kubenswrapper[4563]: I1124 09:20:53.617208 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6k8j2\" (UniqueName: \"kubernetes.io/projected/15b68912-1886-4162-88c8-02a37d34c54a-kube-api-access-6k8j2\") pod \"nova-cell1-novncproxy-0\" (UID: \"15b68912-1886-4162-88c8-02a37d34c54a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:53 crc kubenswrapper[4563]: I1124 09:20:53.628159 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15b68912-1886-4162-88c8-02a37d34c54a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"15b68912-1886-4162-88c8-02a37d34c54a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:53 crc kubenswrapper[4563]: I1124 09:20:53.628531 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/15b68912-1886-4162-88c8-02a37d34c54a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"15b68912-1886-4162-88c8-02a37d34c54a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:53 crc kubenswrapper[4563]: I1124 09:20:53.635169 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k8j2\" (UniqueName: \"kubernetes.io/projected/15b68912-1886-4162-88c8-02a37d34c54a-kube-api-access-6k8j2\") pod \"nova-cell1-novncproxy-0\" (UID: \"15b68912-1886-4162-88c8-02a37d34c54a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:53 crc kubenswrapper[4563]: I1124 09:20:53.635233 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15b68912-1886-4162-88c8-02a37d34c54a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"15b68912-1886-4162-88c8-02a37d34c54a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:53 crc kubenswrapper[4563]: I1124 09:20:53.639319 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/15b68912-1886-4162-88c8-02a37d34c54a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"15b68912-1886-4162-88c8-02a37d34c54a\") " pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:53 crc kubenswrapper[4563]: I1124 09:20:53.663947 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:54 crc kubenswrapper[4563]: I1124 09:20:54.077388 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 24 09:20:54 crc kubenswrapper[4563]: I1124 09:20:54.315725 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"15b68912-1886-4162-88c8-02a37d34c54a","Type":"ContainerStarted","Data":"b9e9136a603b1bb04a8ac3faca6cec1f63451e7d3d59ba546b07a39e6620bab7"} Nov 24 09:20:54 crc kubenswrapper[4563]: I1124 09:20:54.316201 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"15b68912-1886-4162-88c8-02a37d34c54a","Type":"ContainerStarted","Data":"aaa84ade426d11f50cb810f8a1a1cbc69b3d821eb8204e943349347b973bf367"} Nov 24 09:20:54 crc kubenswrapper[4563]: I1124 09:20:54.335728 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.335717133 podStartE2EDuration="1.335717133s" podCreationTimestamp="2025-11-24 09:20:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-24 09:20:54.33142093 +0000 UTC m=+1031.590398377" watchObservedRunningTime="2025-11-24 09:20:54.335717133 +0000 UTC m=+1031.594694580" Nov 24 09:20:54 crc kubenswrapper[4563]: I1124 09:20:54.360742 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 24 09:20:54 crc kubenswrapper[4563]: I1124 09:20:54.361182 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 24 09:20:54 crc kubenswrapper[4563]: I1124 09:20:54.361774 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 24 09:20:54 crc kubenswrapper[4563]: I1124 09:20:54.367875 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 24 09:20:54 crc kubenswrapper[4563]: I1124 09:20:54.520561 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 24 09:20:55 crc kubenswrapper[4563]: I1124 09:20:55.078074 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f84ace12-fa17-4fc7-8bf0-771e8273eb55" path="/var/lib/kubelet/pods/f84ace12-fa17-4fc7-8bf0-771e8273eb55/volumes" Nov 24 09:20:55 crc kubenswrapper[4563]: I1124 09:20:55.323942 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 24 09:20:55 crc kubenswrapper[4563]: I1124 09:20:55.386119 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 24 09:20:55 crc kubenswrapper[4563]: I1124 09:20:55.538836 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d7f54fb65-2xzrp"] Nov 24 09:20:55 crc kubenswrapper[4563]: I1124 09:20:55.540425 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" Nov 24 09:20:55 crc kubenswrapper[4563]: I1124 09:20:55.566172 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7f54fb65-2xzrp"] Nov 24 09:20:55 crc kubenswrapper[4563]: I1124 09:20:55.663202 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-dns-svc\") pod \"dnsmasq-dns-5d7f54fb65-2xzrp\" (UID: \"30ae2d40-0434-41f0-8dcc-f5e67063a428\") " pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" Nov 24 09:20:55 crc kubenswrapper[4563]: I1124 09:20:55.663251 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nkt7\" (UniqueName: \"kubernetes.io/projected/30ae2d40-0434-41f0-8dcc-f5e67063a428-kube-api-access-7nkt7\") pod \"dnsmasq-dns-5d7f54fb65-2xzrp\" (UID: \"30ae2d40-0434-41f0-8dcc-f5e67063a428\") " pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" Nov 24 09:20:55 crc kubenswrapper[4563]: I1124 09:20:55.663285 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-ovsdbserver-nb\") pod \"dnsmasq-dns-5d7f54fb65-2xzrp\" (UID: \"30ae2d40-0434-41f0-8dcc-f5e67063a428\") " pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" Nov 24 09:20:55 crc kubenswrapper[4563]: I1124 09:20:55.663916 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-dns-swift-storage-0\") pod \"dnsmasq-dns-5d7f54fb65-2xzrp\" (UID: \"30ae2d40-0434-41f0-8dcc-f5e67063a428\") " pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" Nov 24 09:20:55 crc kubenswrapper[4563]: I1124 09:20:55.663989 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-ovsdbserver-sb\") pod \"dnsmasq-dns-5d7f54fb65-2xzrp\" (UID: \"30ae2d40-0434-41f0-8dcc-f5e67063a428\") " pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" Nov 24 09:20:55 crc kubenswrapper[4563]: I1124 09:20:55.664115 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-config\") pod \"dnsmasq-dns-5d7f54fb65-2xzrp\" (UID: \"30ae2d40-0434-41f0-8dcc-f5e67063a428\") " pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" Nov 24 09:20:55 crc kubenswrapper[4563]: I1124 09:20:55.767180 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-ovsdbserver-sb\") pod \"dnsmasq-dns-5d7f54fb65-2xzrp\" (UID: \"30ae2d40-0434-41f0-8dcc-f5e67063a428\") " pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" Nov 24 09:20:55 crc kubenswrapper[4563]: I1124 09:20:55.767258 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-config\") pod \"dnsmasq-dns-5d7f54fb65-2xzrp\" (UID: \"30ae2d40-0434-41f0-8dcc-f5e67063a428\") " pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" Nov 24 09:20:55 crc kubenswrapper[4563]: I1124 09:20:55.767353 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-dns-svc\") pod \"dnsmasq-dns-5d7f54fb65-2xzrp\" (UID: \"30ae2d40-0434-41f0-8dcc-f5e67063a428\") " pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" Nov 24 09:20:55 crc kubenswrapper[4563]: I1124 09:20:55.767387 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7nkt7\" (UniqueName: \"kubernetes.io/projected/30ae2d40-0434-41f0-8dcc-f5e67063a428-kube-api-access-7nkt7\") pod \"dnsmasq-dns-5d7f54fb65-2xzrp\" (UID: \"30ae2d40-0434-41f0-8dcc-f5e67063a428\") " pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" Nov 24 09:20:55 crc kubenswrapper[4563]: I1124 09:20:55.767420 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-ovsdbserver-nb\") pod \"dnsmasq-dns-5d7f54fb65-2xzrp\" (UID: \"30ae2d40-0434-41f0-8dcc-f5e67063a428\") " pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" Nov 24 09:20:55 crc kubenswrapper[4563]: I1124 09:20:55.767487 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-dns-swift-storage-0\") pod \"dnsmasq-dns-5d7f54fb65-2xzrp\" (UID: \"30ae2d40-0434-41f0-8dcc-f5e67063a428\") " pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" Nov 24 09:20:55 crc kubenswrapper[4563]: I1124 09:20:55.768733 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-dns-svc\") pod \"dnsmasq-dns-5d7f54fb65-2xzrp\" (UID: \"30ae2d40-0434-41f0-8dcc-f5e67063a428\") " pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" Nov 24 09:20:55 crc kubenswrapper[4563]: I1124 09:20:55.768758 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-dns-swift-storage-0\") pod \"dnsmasq-dns-5d7f54fb65-2xzrp\" (UID: \"30ae2d40-0434-41f0-8dcc-f5e67063a428\") " pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" Nov 24 09:20:55 crc kubenswrapper[4563]: I1124 09:20:55.768773 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-ovsdbserver-sb\") pod \"dnsmasq-dns-5d7f54fb65-2xzrp\" (UID: \"30ae2d40-0434-41f0-8dcc-f5e67063a428\") " pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" Nov 24 09:20:55 crc kubenswrapper[4563]: I1124 09:20:55.768837 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-ovsdbserver-nb\") pod \"dnsmasq-dns-5d7f54fb65-2xzrp\" (UID: \"30ae2d40-0434-41f0-8dcc-f5e67063a428\") " pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" Nov 24 09:20:55 crc kubenswrapper[4563]: I1124 09:20:55.769010 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-config\") pod \"dnsmasq-dns-5d7f54fb65-2xzrp\" (UID: \"30ae2d40-0434-41f0-8dcc-f5e67063a428\") " pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" Nov 24 09:20:55 crc kubenswrapper[4563]: I1124 09:20:55.787267 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nkt7\" (UniqueName: \"kubernetes.io/projected/30ae2d40-0434-41f0-8dcc-f5e67063a428-kube-api-access-7nkt7\") pod \"dnsmasq-dns-5d7f54fb65-2xzrp\" (UID: \"30ae2d40-0434-41f0-8dcc-f5e67063a428\") " pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" Nov 24 09:20:55 crc kubenswrapper[4563]: I1124 09:20:55.874395 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" Nov 24 09:20:56 crc kubenswrapper[4563]: I1124 09:20:56.301069 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7f54fb65-2xzrp"] Nov 24 09:20:56 crc kubenswrapper[4563]: W1124 09:20:56.306632 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30ae2d40_0434_41f0_8dcc_f5e67063a428.slice/crio-167ffbc96580c688c2fd3a06c4bcfe5296fc6da6f6a50fa8a9e124246f1a7a17 WatchSource:0}: Error finding container 167ffbc96580c688c2fd3a06c4bcfe5296fc6da6f6a50fa8a9e124246f1a7a17: Status 404 returned error can't find the container with id 167ffbc96580c688c2fd3a06c4bcfe5296fc6da6f6a50fa8a9e124246f1a7a17 Nov 24 09:20:56 crc kubenswrapper[4563]: I1124 09:20:56.334683 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" event={"ID":"30ae2d40-0434-41f0-8dcc-f5e67063a428","Type":"ContainerStarted","Data":"167ffbc96580c688c2fd3a06c4bcfe5296fc6da6f6a50fa8a9e124246f1a7a17"} Nov 24 09:20:57 crc kubenswrapper[4563]: I1124 09:20:57.348081 4563 generic.go:334] "Generic (PLEG): container finished" podID="30ae2d40-0434-41f0-8dcc-f5e67063a428" containerID="fab1e793dd14f0e6449009b3d828296a7e058bb38a079b344e74f0e2b72b702b" exitCode=0 Nov 24 09:20:57 crc kubenswrapper[4563]: I1124 09:20:57.348162 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" event={"ID":"30ae2d40-0434-41f0-8dcc-f5e67063a428","Type":"ContainerDied","Data":"fab1e793dd14f0e6449009b3d828296a7e058bb38a079b344e74f0e2b72b702b"} Nov 24 09:20:57 crc kubenswrapper[4563]: I1124 09:20:57.687134 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 24 09:20:58 crc kubenswrapper[4563]: I1124 09:20:58.116970 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:20:58 crc kubenswrapper[4563]: I1124 
09:20:58.117196 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0db1e776-c3a0-4046-bd24-45b7c409eac6" containerName="ceilometer-central-agent" containerID="cri-o://6c4cbf24f8c85a11cd542c9a08818322d0e2d6c5c0380b75889f02336f838efb" gracePeriod=30 Nov 24 09:20:58 crc kubenswrapper[4563]: I1124 09:20:58.117331 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0db1e776-c3a0-4046-bd24-45b7c409eac6" containerName="proxy-httpd" containerID="cri-o://49da0c412c7439eb62b15d892b4e35f7c450a82f7010eff1b212a811038a05ff" gracePeriod=30 Nov 24 09:20:58 crc kubenswrapper[4563]: I1124 09:20:58.117444 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0db1e776-c3a0-4046-bd24-45b7c409eac6" containerName="ceilometer-notification-agent" containerID="cri-o://c3cb6d371d6b3566da9135ae07912acc06fe4a188b3ee1b6ee78ee1e8b1b66c1" gracePeriod=30 Nov 24 09:20:58 crc kubenswrapper[4563]: I1124 09:20:58.117468 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0db1e776-c3a0-4046-bd24-45b7c409eac6" containerName="sg-core" containerID="cri-o://907d0409e523cd69b2e93922950d57e921d003720ded4c9a70a84fe0276ff045" gracePeriod=30 Nov 24 09:20:58 crc kubenswrapper[4563]: I1124 09:20:58.364988 4563 generic.go:334] "Generic (PLEG): container finished" podID="0db1e776-c3a0-4046-bd24-45b7c409eac6" containerID="49da0c412c7439eb62b15d892b4e35f7c450a82f7010eff1b212a811038a05ff" exitCode=0 Nov 24 09:20:58 crc kubenswrapper[4563]: I1124 09:20:58.365248 4563 generic.go:334] "Generic (PLEG): container finished" podID="0db1e776-c3a0-4046-bd24-45b7c409eac6" containerID="907d0409e523cd69b2e93922950d57e921d003720ded4c9a70a84fe0276ff045" exitCode=2 Nov 24 09:20:58 crc kubenswrapper[4563]: I1124 09:20:58.365282 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"0db1e776-c3a0-4046-bd24-45b7c409eac6","Type":"ContainerDied","Data":"49da0c412c7439eb62b15d892b4e35f7c450a82f7010eff1b212a811038a05ff"} Nov 24 09:20:58 crc kubenswrapper[4563]: I1124 09:20:58.365306 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0db1e776-c3a0-4046-bd24-45b7c409eac6","Type":"ContainerDied","Data":"907d0409e523cd69b2e93922950d57e921d003720ded4c9a70a84fe0276ff045"} Nov 24 09:20:58 crc kubenswrapper[4563]: I1124 09:20:58.367495 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" event={"ID":"30ae2d40-0434-41f0-8dcc-f5e67063a428","Type":"ContainerStarted","Data":"7c3e6205926990d76ca04675ed9f5ffc5ff948f981d63a7df9649ea1af2208b6"} Nov 24 09:20:58 crc kubenswrapper[4563]: I1124 09:20:58.367572 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" Nov 24 09:20:58 crc kubenswrapper[4563]: I1124 09:20:58.367941 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="961502a3-4e3a-49f0-a8d1-67e462c289e5" containerName="nova-api-log" containerID="cri-o://deb1e86a08f2f8b30c21ff687cc39cdf3150e66713f4ac65fa23d2e7201552ed" gracePeriod=30 Nov 24 09:20:58 crc kubenswrapper[4563]: I1124 09:20:58.367982 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="961502a3-4e3a-49f0-a8d1-67e462c289e5" containerName="nova-api-api" containerID="cri-o://3cbbf00b512a0046ad8a6f6e50bb7d228e29c81f5383c2f81b308c3a3d22e65b" gracePeriod=30 Nov 24 09:20:58 crc kubenswrapper[4563]: I1124 09:20:58.403615 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" podStartSLOduration=3.403590721 podStartE2EDuration="3.403590721s" podCreationTimestamp="2025-11-24 09:20:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:20:58.393343444 +0000 UTC m=+1035.652320891" watchObservedRunningTime="2025-11-24 09:20:58.403590721 +0000 UTC m=+1035.662568168" Nov 24 09:20:58 crc kubenswrapper[4563]: I1124 09:20:58.665039 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.382740 4563 generic.go:334] "Generic (PLEG): container finished" podID="961502a3-4e3a-49f0-a8d1-67e462c289e5" containerID="deb1e86a08f2f8b30c21ff687cc39cdf3150e66713f4ac65fa23d2e7201552ed" exitCode=143 Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.383401 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"961502a3-4e3a-49f0-a8d1-67e462c289e5","Type":"ContainerDied","Data":"deb1e86a08f2f8b30c21ff687cc39cdf3150e66713f4ac65fa23d2e7201552ed"} Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.389774 4563 generic.go:334] "Generic (PLEG): container finished" podID="0db1e776-c3a0-4046-bd24-45b7c409eac6" containerID="c3cb6d371d6b3566da9135ae07912acc06fe4a188b3ee1b6ee78ee1e8b1b66c1" exitCode=0 Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.390115 4563 generic.go:334] "Generic (PLEG): container finished" podID="0db1e776-c3a0-4046-bd24-45b7c409eac6" containerID="6c4cbf24f8c85a11cd542c9a08818322d0e2d6c5c0380b75889f02336f838efb" exitCode=0 Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.391768 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0db1e776-c3a0-4046-bd24-45b7c409eac6","Type":"ContainerDied","Data":"c3cb6d371d6b3566da9135ae07912acc06fe4a188b3ee1b6ee78ee1e8b1b66c1"} Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.391813 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"0db1e776-c3a0-4046-bd24-45b7c409eac6","Type":"ContainerDied","Data":"6c4cbf24f8c85a11cd542c9a08818322d0e2d6c5c0380b75889f02336f838efb"} Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.576223 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.760938 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-ceilometer-tls-certs\") pod \"0db1e776-c3a0-4046-bd24-45b7c409eac6\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.761049 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0db1e776-c3a0-4046-bd24-45b7c409eac6-log-httpd\") pod \"0db1e776-c3a0-4046-bd24-45b7c409eac6\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.761091 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8llsw\" (UniqueName: \"kubernetes.io/projected/0db1e776-c3a0-4046-bd24-45b7c409eac6-kube-api-access-8llsw\") pod \"0db1e776-c3a0-4046-bd24-45b7c409eac6\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.761149 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-combined-ca-bundle\") pod \"0db1e776-c3a0-4046-bd24-45b7c409eac6\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.761401 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-config-data\") pod \"0db1e776-c3a0-4046-bd24-45b7c409eac6\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.761522 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-sg-core-conf-yaml\") pod \"0db1e776-c3a0-4046-bd24-45b7c409eac6\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.761595 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0db1e776-c3a0-4046-bd24-45b7c409eac6-run-httpd\") pod \"0db1e776-c3a0-4046-bd24-45b7c409eac6\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.761658 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-scripts\") pod \"0db1e776-c3a0-4046-bd24-45b7c409eac6\" (UID: \"0db1e776-c3a0-4046-bd24-45b7c409eac6\") " Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.761688 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0db1e776-c3a0-4046-bd24-45b7c409eac6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0db1e776-c3a0-4046-bd24-45b7c409eac6" (UID: "0db1e776-c3a0-4046-bd24-45b7c409eac6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.762199 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0db1e776-c3a0-4046-bd24-45b7c409eac6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0db1e776-c3a0-4046-bd24-45b7c409eac6" (UID: "0db1e776-c3a0-4046-bd24-45b7c409eac6"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.762899 4563 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0db1e776-c3a0-4046-bd24-45b7c409eac6-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.762921 4563 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0db1e776-c3a0-4046-bd24-45b7c409eac6-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.768747 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-scripts" (OuterVolumeSpecName: "scripts") pod "0db1e776-c3a0-4046-bd24-45b7c409eac6" (UID: "0db1e776-c3a0-4046-bd24-45b7c409eac6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.768783 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0db1e776-c3a0-4046-bd24-45b7c409eac6-kube-api-access-8llsw" (OuterVolumeSpecName: "kube-api-access-8llsw") pod "0db1e776-c3a0-4046-bd24-45b7c409eac6" (UID: "0db1e776-c3a0-4046-bd24-45b7c409eac6"). InnerVolumeSpecName "kube-api-access-8llsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.791627 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0db1e776-c3a0-4046-bd24-45b7c409eac6" (UID: "0db1e776-c3a0-4046-bd24-45b7c409eac6"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.815520 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "0db1e776-c3a0-4046-bd24-45b7c409eac6" (UID: "0db1e776-c3a0-4046-bd24-45b7c409eac6"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.827653 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0db1e776-c3a0-4046-bd24-45b7c409eac6" (UID: "0db1e776-c3a0-4046-bd24-45b7c409eac6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.842272 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-config-data" (OuterVolumeSpecName: "config-data") pod "0db1e776-c3a0-4046-bd24-45b7c409eac6" (UID: "0db1e776-c3a0-4046-bd24-45b7c409eac6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.865823 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.865853 4563 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.865868 4563 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.865880 4563 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.865891 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8llsw\" (UniqueName: \"kubernetes.io/projected/0db1e776-c3a0-4046-bd24-45b7c409eac6-kube-api-access-8llsw\") on node \"crc\" DevicePath \"\"" Nov 24 09:20:59 crc kubenswrapper[4563]: I1124 09:20:59.865904 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0db1e776-c3a0-4046-bd24-45b7c409eac6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.402260 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0db1e776-c3a0-4046-bd24-45b7c409eac6","Type":"ContainerDied","Data":"0f623c856568628c85fc14e89fedd203f2fc268384de0c5c6b61d78e6932dced"} Nov 24 09:21:00 crc 
kubenswrapper[4563]: I1124 09:21:00.402565 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.402602 4563 scope.go:117] "RemoveContainer" containerID="49da0c412c7439eb62b15d892b4e35f7c450a82f7010eff1b212a811038a05ff" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.424778 4563 scope.go:117] "RemoveContainer" containerID="907d0409e523cd69b2e93922950d57e921d003720ded4c9a70a84fe0276ff045" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.437943 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.446696 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.447332 4563 scope.go:117] "RemoveContainer" containerID="c3cb6d371d6b3566da9135ae07912acc06fe4a188b3ee1b6ee78ee1e8b1b66c1" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.467427 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:21:00 crc kubenswrapper[4563]: E1124 09:21:00.468838 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0db1e776-c3a0-4046-bd24-45b7c409eac6" containerName="ceilometer-notification-agent" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.468859 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="0db1e776-c3a0-4046-bd24-45b7c409eac6" containerName="ceilometer-notification-agent" Nov 24 09:21:00 crc kubenswrapper[4563]: E1124 09:21:00.468880 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0db1e776-c3a0-4046-bd24-45b7c409eac6" containerName="proxy-httpd" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.468887 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="0db1e776-c3a0-4046-bd24-45b7c409eac6" containerName="proxy-httpd" Nov 24 09:21:00 crc kubenswrapper[4563]: E1124 
09:21:00.468919 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0db1e776-c3a0-4046-bd24-45b7c409eac6" containerName="ceilometer-central-agent" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.468925 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="0db1e776-c3a0-4046-bd24-45b7c409eac6" containerName="ceilometer-central-agent" Nov 24 09:21:00 crc kubenswrapper[4563]: E1124 09:21:00.468936 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0db1e776-c3a0-4046-bd24-45b7c409eac6" containerName="sg-core" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.468942 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="0db1e776-c3a0-4046-bd24-45b7c409eac6" containerName="sg-core" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.469111 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="0db1e776-c3a0-4046-bd24-45b7c409eac6" containerName="sg-core" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.469130 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="0db1e776-c3a0-4046-bd24-45b7c409eac6" containerName="ceilometer-notification-agent" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.469139 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="0db1e776-c3a0-4046-bd24-45b7c409eac6" containerName="proxy-httpd" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.469152 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="0db1e776-c3a0-4046-bd24-45b7c409eac6" containerName="ceilometer-central-agent" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.469818 4563 scope.go:117] "RemoveContainer" containerID="6c4cbf24f8c85a11cd542c9a08818322d0e2d6c5c0380b75889f02336f838efb" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.470891 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.475821 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.476391 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.477494 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86acf291-2839-49f7-aaf3-33ba6e0cae2e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"86acf291-2839-49f7-aaf3-33ba6e0cae2e\") " pod="openstack/ceilometer-0" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.477712 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86acf291-2839-49f7-aaf3-33ba6e0cae2e-log-httpd\") pod \"ceilometer-0\" (UID: \"86acf291-2839-49f7-aaf3-33ba6e0cae2e\") " pod="openstack/ceilometer-0" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.477749 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86acf291-2839-49f7-aaf3-33ba6e0cae2e-scripts\") pod \"ceilometer-0\" (UID: \"86acf291-2839-49f7-aaf3-33ba6e0cae2e\") " pod="openstack/ceilometer-0" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.477905 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzftv\" (UniqueName: \"kubernetes.io/projected/86acf291-2839-49f7-aaf3-33ba6e0cae2e-kube-api-access-fzftv\") pod \"ceilometer-0\" (UID: \"86acf291-2839-49f7-aaf3-33ba6e0cae2e\") " pod="openstack/ceilometer-0" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.477949 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/86acf291-2839-49f7-aaf3-33ba6e0cae2e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"86acf291-2839-49f7-aaf3-33ba6e0cae2e\") " pod="openstack/ceilometer-0" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.477984 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86acf291-2839-49f7-aaf3-33ba6e0cae2e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"86acf291-2839-49f7-aaf3-33ba6e0cae2e\") " pod="openstack/ceilometer-0" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.478079 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86acf291-2839-49f7-aaf3-33ba6e0cae2e-run-httpd\") pod \"ceilometer-0\" (UID: \"86acf291-2839-49f7-aaf3-33ba6e0cae2e\") " pod="openstack/ceilometer-0" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.478123 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86acf291-2839-49f7-aaf3-33ba6e0cae2e-config-data\") pod \"ceilometer-0\" (UID: \"86acf291-2839-49f7-aaf3-33ba6e0cae2e\") " pod="openstack/ceilometer-0" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.478194 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.478258 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.581118 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/86acf291-2839-49f7-aaf3-33ba6e0cae2e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"86acf291-2839-49f7-aaf3-33ba6e0cae2e\") " pod="openstack/ceilometer-0" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.581227 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86acf291-2839-49f7-aaf3-33ba6e0cae2e-run-httpd\") pod \"ceilometer-0\" (UID: \"86acf291-2839-49f7-aaf3-33ba6e0cae2e\") " pod="openstack/ceilometer-0" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.581261 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86acf291-2839-49f7-aaf3-33ba6e0cae2e-config-data\") pod \"ceilometer-0\" (UID: \"86acf291-2839-49f7-aaf3-33ba6e0cae2e\") " pod="openstack/ceilometer-0" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.581361 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86acf291-2839-49f7-aaf3-33ba6e0cae2e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"86acf291-2839-49f7-aaf3-33ba6e0cae2e\") " pod="openstack/ceilometer-0" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.581517 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86acf291-2839-49f7-aaf3-33ba6e0cae2e-log-httpd\") pod \"ceilometer-0\" (UID: \"86acf291-2839-49f7-aaf3-33ba6e0cae2e\") " pod="openstack/ceilometer-0" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.581548 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86acf291-2839-49f7-aaf3-33ba6e0cae2e-scripts\") pod \"ceilometer-0\" (UID: \"86acf291-2839-49f7-aaf3-33ba6e0cae2e\") " pod="openstack/ceilometer-0" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.582101 4563 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86acf291-2839-49f7-aaf3-33ba6e0cae2e-log-httpd\") pod \"ceilometer-0\" (UID: \"86acf291-2839-49f7-aaf3-33ba6e0cae2e\") " pod="openstack/ceilometer-0" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.582104 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86acf291-2839-49f7-aaf3-33ba6e0cae2e-run-httpd\") pod \"ceilometer-0\" (UID: \"86acf291-2839-49f7-aaf3-33ba6e0cae2e\") " pod="openstack/ceilometer-0" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.582548 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzftv\" (UniqueName: \"kubernetes.io/projected/86acf291-2839-49f7-aaf3-33ba6e0cae2e-kube-api-access-fzftv\") pod \"ceilometer-0\" (UID: \"86acf291-2839-49f7-aaf3-33ba6e0cae2e\") " pod="openstack/ceilometer-0" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.582716 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/86acf291-2839-49f7-aaf3-33ba6e0cae2e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"86acf291-2839-49f7-aaf3-33ba6e0cae2e\") " pod="openstack/ceilometer-0" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.586565 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/86acf291-2839-49f7-aaf3-33ba6e0cae2e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"86acf291-2839-49f7-aaf3-33ba6e0cae2e\") " pod="openstack/ceilometer-0" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.586828 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86acf291-2839-49f7-aaf3-33ba6e0cae2e-config-data\") pod \"ceilometer-0\" (UID: 
\"86acf291-2839-49f7-aaf3-33ba6e0cae2e\") " pod="openstack/ceilometer-0" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.587214 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86acf291-2839-49f7-aaf3-33ba6e0cae2e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"86acf291-2839-49f7-aaf3-33ba6e0cae2e\") " pod="openstack/ceilometer-0" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.588009 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86acf291-2839-49f7-aaf3-33ba6e0cae2e-scripts\") pod \"ceilometer-0\" (UID: \"86acf291-2839-49f7-aaf3-33ba6e0cae2e\") " pod="openstack/ceilometer-0" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.588913 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86acf291-2839-49f7-aaf3-33ba6e0cae2e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"86acf291-2839-49f7-aaf3-33ba6e0cae2e\") " pod="openstack/ceilometer-0" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.597030 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzftv\" (UniqueName: \"kubernetes.io/projected/86acf291-2839-49f7-aaf3-33ba6e0cae2e-kube-api-access-fzftv\") pod \"ceilometer-0\" (UID: \"86acf291-2839-49f7-aaf3-33ba6e0cae2e\") " pod="openstack/ceilometer-0" Nov 24 09:21:00 crc kubenswrapper[4563]: I1124 09:21:00.784874 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 24 09:21:01 crc kubenswrapper[4563]: I1124 09:21:01.116218 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0db1e776-c3a0-4046-bd24-45b7c409eac6" path="/var/lib/kubelet/pods/0db1e776-c3a0-4046-bd24-45b7c409eac6/volumes" Nov 24 09:21:01 crc kubenswrapper[4563]: I1124 09:21:01.238832 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 24 09:21:01 crc kubenswrapper[4563]: W1124 09:21:01.245890 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86acf291_2839_49f7_aaf3_33ba6e0cae2e.slice/crio-b58724040aa891d95a1615e33000e05ec9dd1c0ae292b52e07e6c8e77f517c88 WatchSource:0}: Error finding container b58724040aa891d95a1615e33000e05ec9dd1c0ae292b52e07e6c8e77f517c88: Status 404 returned error can't find the container with id b58724040aa891d95a1615e33000e05ec9dd1c0ae292b52e07e6c8e77f517c88 Nov 24 09:21:01 crc kubenswrapper[4563]: I1124 09:21:01.412938 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86acf291-2839-49f7-aaf3-33ba6e0cae2e","Type":"ContainerStarted","Data":"b58724040aa891d95a1615e33000e05ec9dd1c0ae292b52e07e6c8e77f517c88"} Nov 24 09:21:01 crc kubenswrapper[4563]: I1124 09:21:01.848569 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 24 09:21:01 crc kubenswrapper[4563]: I1124 09:21:01.914667 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/961502a3-4e3a-49f0-a8d1-67e462c289e5-logs\") pod \"961502a3-4e3a-49f0-a8d1-67e462c289e5\" (UID: \"961502a3-4e3a-49f0-a8d1-67e462c289e5\") " Nov 24 09:21:01 crc kubenswrapper[4563]: I1124 09:21:01.914724 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/961502a3-4e3a-49f0-a8d1-67e462c289e5-combined-ca-bundle\") pod \"961502a3-4e3a-49f0-a8d1-67e462c289e5\" (UID: \"961502a3-4e3a-49f0-a8d1-67e462c289e5\") " Nov 24 09:21:01 crc kubenswrapper[4563]: I1124 09:21:01.914957 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/961502a3-4e3a-49f0-a8d1-67e462c289e5-config-data\") pod \"961502a3-4e3a-49f0-a8d1-67e462c289e5\" (UID: \"961502a3-4e3a-49f0-a8d1-67e462c289e5\") " Nov 24 09:21:01 crc kubenswrapper[4563]: I1124 09:21:01.915035 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsd8w\" (UniqueName: \"kubernetes.io/projected/961502a3-4e3a-49f0-a8d1-67e462c289e5-kube-api-access-lsd8w\") pod \"961502a3-4e3a-49f0-a8d1-67e462c289e5\" (UID: \"961502a3-4e3a-49f0-a8d1-67e462c289e5\") " Nov 24 09:21:01 crc kubenswrapper[4563]: I1124 09:21:01.916791 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/961502a3-4e3a-49f0-a8d1-67e462c289e5-logs" (OuterVolumeSpecName: "logs") pod "961502a3-4e3a-49f0-a8d1-67e462c289e5" (UID: "961502a3-4e3a-49f0-a8d1-67e462c289e5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:21:01 crc kubenswrapper[4563]: I1124 09:21:01.920441 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/961502a3-4e3a-49f0-a8d1-67e462c289e5-kube-api-access-lsd8w" (OuterVolumeSpecName: "kube-api-access-lsd8w") pod "961502a3-4e3a-49f0-a8d1-67e462c289e5" (UID: "961502a3-4e3a-49f0-a8d1-67e462c289e5"). InnerVolumeSpecName "kube-api-access-lsd8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:21:01 crc kubenswrapper[4563]: I1124 09:21:01.942713 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/961502a3-4e3a-49f0-a8d1-67e462c289e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "961502a3-4e3a-49f0-a8d1-67e462c289e5" (UID: "961502a3-4e3a-49f0-a8d1-67e462c289e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:21:01 crc kubenswrapper[4563]: I1124 09:21:01.942847 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/961502a3-4e3a-49f0-a8d1-67e462c289e5-config-data" (OuterVolumeSpecName: "config-data") pod "961502a3-4e3a-49f0-a8d1-67e462c289e5" (UID: "961502a3-4e3a-49f0-a8d1-67e462c289e5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.018355 4563 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/961502a3-4e3a-49f0-a8d1-67e462c289e5-logs\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.018382 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/961502a3-4e3a-49f0-a8d1-67e462c289e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.018394 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/961502a3-4e3a-49f0-a8d1-67e462c289e5-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.018403 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsd8w\" (UniqueName: \"kubernetes.io/projected/961502a3-4e3a-49f0-a8d1-67e462c289e5-kube-api-access-lsd8w\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.432704 4563 generic.go:334] "Generic (PLEG): container finished" podID="961502a3-4e3a-49f0-a8d1-67e462c289e5" containerID="3cbbf00b512a0046ad8a6f6e50bb7d228e29c81f5383c2f81b308c3a3d22e65b" exitCode=0 Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.433161 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.434782 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"961502a3-4e3a-49f0-a8d1-67e462c289e5","Type":"ContainerDied","Data":"3cbbf00b512a0046ad8a6f6e50bb7d228e29c81f5383c2f81b308c3a3d22e65b"} Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.434857 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"961502a3-4e3a-49f0-a8d1-67e462c289e5","Type":"ContainerDied","Data":"57cdba65fe7e06a3047ec732ab2bf7d2b752c184d0c554dbb4c2a0396e6c15f3"} Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.434897 4563 scope.go:117] "RemoveContainer" containerID="3cbbf00b512a0046ad8a6f6e50bb7d228e29c81f5383c2f81b308c3a3d22e65b" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.441947 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86acf291-2839-49f7-aaf3-33ba6e0cae2e","Type":"ContainerStarted","Data":"0093ddfdd6d3195501bb0541992d1afe2efd19b891defcc9c362ed218d8f61cb"} Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.577570 4563 scope.go:117] "RemoveContainer" containerID="deb1e86a08f2f8b30c21ff687cc39cdf3150e66713f4ac65fa23d2e7201552ed" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.604859 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.611995 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.621322 4563 scope.go:117] "RemoveContainer" containerID="3cbbf00b512a0046ad8a6f6e50bb7d228e29c81f5383c2f81b308c3a3d22e65b" Nov 24 09:21:02 crc kubenswrapper[4563]: E1124 09:21:02.626113 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3cbbf00b512a0046ad8a6f6e50bb7d228e29c81f5383c2f81b308c3a3d22e65b\": container with ID starting with 3cbbf00b512a0046ad8a6f6e50bb7d228e29c81f5383c2f81b308c3a3d22e65b not found: ID does not exist" containerID="3cbbf00b512a0046ad8a6f6e50bb7d228e29c81f5383c2f81b308c3a3d22e65b" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.626162 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cbbf00b512a0046ad8a6f6e50bb7d228e29c81f5383c2f81b308c3a3d22e65b"} err="failed to get container status \"3cbbf00b512a0046ad8a6f6e50bb7d228e29c81f5383c2f81b308c3a3d22e65b\": rpc error: code = NotFound desc = could not find container \"3cbbf00b512a0046ad8a6f6e50bb7d228e29c81f5383c2f81b308c3a3d22e65b\": container with ID starting with 3cbbf00b512a0046ad8a6f6e50bb7d228e29c81f5383c2f81b308c3a3d22e65b not found: ID does not exist" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.626190 4563 scope.go:117] "RemoveContainer" containerID="deb1e86a08f2f8b30c21ff687cc39cdf3150e66713f4ac65fa23d2e7201552ed" Nov 24 09:21:02 crc kubenswrapper[4563]: E1124 09:21:02.627025 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deb1e86a08f2f8b30c21ff687cc39cdf3150e66713f4ac65fa23d2e7201552ed\": container with ID starting with deb1e86a08f2f8b30c21ff687cc39cdf3150e66713f4ac65fa23d2e7201552ed not found: ID does not exist" containerID="deb1e86a08f2f8b30c21ff687cc39cdf3150e66713f4ac65fa23d2e7201552ed" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.627067 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deb1e86a08f2f8b30c21ff687cc39cdf3150e66713f4ac65fa23d2e7201552ed"} err="failed to get container status \"deb1e86a08f2f8b30c21ff687cc39cdf3150e66713f4ac65fa23d2e7201552ed\": rpc error: code = NotFound desc = could not find container \"deb1e86a08f2f8b30c21ff687cc39cdf3150e66713f4ac65fa23d2e7201552ed\": container with ID 
starting with deb1e86a08f2f8b30c21ff687cc39cdf3150e66713f4ac65fa23d2e7201552ed not found: ID does not exist" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.632502 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 24 09:21:02 crc kubenswrapper[4563]: E1124 09:21:02.632992 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="961502a3-4e3a-49f0-a8d1-67e462c289e5" containerName="nova-api-log" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.633005 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="961502a3-4e3a-49f0-a8d1-67e462c289e5" containerName="nova-api-log" Nov 24 09:21:02 crc kubenswrapper[4563]: E1124 09:21:02.633025 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="961502a3-4e3a-49f0-a8d1-67e462c289e5" containerName="nova-api-api" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.633031 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="961502a3-4e3a-49f0-a8d1-67e462c289e5" containerName="nova-api-api" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.633239 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="961502a3-4e3a-49f0-a8d1-67e462c289e5" containerName="nova-api-api" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.633265 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="961502a3-4e3a-49f0-a8d1-67e462c289e5" containerName="nova-api-log" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.634321 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.636752 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.637000 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.637086 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.641962 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.736611 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\") " pod="openstack/nova-api-0" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.736899 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d89c\" (UniqueName: \"kubernetes.io/projected/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-kube-api-access-6d89c\") pod \"nova-api-0\" (UID: \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\") " pod="openstack/nova-api-0" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.737161 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-logs\") pod \"nova-api-0\" (UID: \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\") " pod="openstack/nova-api-0" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.737449 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-config-data\") pod \"nova-api-0\" (UID: \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\") " pod="openstack/nova-api-0" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.737523 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\") " pod="openstack/nova-api-0" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.737554 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-public-tls-certs\") pod \"nova-api-0\" (UID: \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\") " pod="openstack/nova-api-0" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.840454 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\") " pod="openstack/nova-api-0" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.840583 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d89c\" (UniqueName: \"kubernetes.io/projected/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-kube-api-access-6d89c\") pod \"nova-api-0\" (UID: \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\") " pod="openstack/nova-api-0" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.840664 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-logs\") pod \"nova-api-0\" (UID: \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\") " pod="openstack/nova-api-0" Nov 24 
09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.840730 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-config-data\") pod \"nova-api-0\" (UID: \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\") " pod="openstack/nova-api-0" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.840764 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\") " pod="openstack/nova-api-0" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.840787 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-public-tls-certs\") pod \"nova-api-0\" (UID: \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\") " pod="openstack/nova-api-0" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.846397 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-logs\") pod \"nova-api-0\" (UID: \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\") " pod="openstack/nova-api-0" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.847673 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\") " pod="openstack/nova-api-0" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.848024 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-config-data\") pod \"nova-api-0\" (UID: 
\"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\") " pod="openstack/nova-api-0" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.848323 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-public-tls-certs\") pod \"nova-api-0\" (UID: \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\") " pod="openstack/nova-api-0" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.853405 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\") " pod="openstack/nova-api-0" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.869102 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d89c\" (UniqueName: \"kubernetes.io/projected/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-kube-api-access-6d89c\") pod \"nova-api-0\" (UID: \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\") " pod="openstack/nova-api-0" Nov 24 09:21:02 crc kubenswrapper[4563]: I1124 09:21:02.960745 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 24 09:21:03 crc kubenswrapper[4563]: I1124 09:21:03.078545 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="961502a3-4e3a-49f0-a8d1-67e462c289e5" path="/var/lib/kubelet/pods/961502a3-4e3a-49f0-a8d1-67e462c289e5/volumes" Nov 24 09:21:03 crc kubenswrapper[4563]: I1124 09:21:03.389396 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 24 09:21:03 crc kubenswrapper[4563]: W1124 09:21:03.395622 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8bc2cfa3_1ea6_4fa8_a9ae_80d2280891a3.slice/crio-6d93f0fa34437a670d4c2f9f06c29772e1390789ea73b7b8ccc2b747a57e48a4 WatchSource:0}: Error finding container 6d93f0fa34437a670d4c2f9f06c29772e1390789ea73b7b8ccc2b747a57e48a4: Status 404 returned error can't find the container with id 6d93f0fa34437a670d4c2f9f06c29772e1390789ea73b7b8ccc2b747a57e48a4 Nov 24 09:21:03 crc kubenswrapper[4563]: I1124 09:21:03.455242 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86acf291-2839-49f7-aaf3-33ba6e0cae2e","Type":"ContainerStarted","Data":"78a68be49b12b06d5513fab782a45a51fa67516d99ca9485f7820f8580429ce7"} Nov 24 09:21:03 crc kubenswrapper[4563]: I1124 09:21:03.457966 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3","Type":"ContainerStarted","Data":"6d93f0fa34437a670d4c2f9f06c29772e1390789ea73b7b8ccc2b747a57e48a4"} Nov 24 09:21:03 crc kubenswrapper[4563]: I1124 09:21:03.666000 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:21:03 crc kubenswrapper[4563]: I1124 09:21:03.689912 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:21:04 crc kubenswrapper[4563]: I1124 
09:21:04.470281 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86acf291-2839-49f7-aaf3-33ba6e0cae2e","Type":"ContainerStarted","Data":"868f4bf2a43d7211acaf46550194d7e118f5baf8c41794ab3dc597e54ecfdc03"} Nov 24 09:21:04 crc kubenswrapper[4563]: I1124 09:21:04.474527 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3","Type":"ContainerStarted","Data":"4172ada8a05efc2a60eeabed50778fe4acc63311db4765927d67a45967146a8a"} Nov 24 09:21:04 crc kubenswrapper[4563]: I1124 09:21:04.474683 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3","Type":"ContainerStarted","Data":"8f2b3b9ef627837848f57d6083de9192844fcd27251c91725168610997ca5743"} Nov 24 09:21:04 crc kubenswrapper[4563]: I1124 09:21:04.495528 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Nov 24 09:21:04 crc kubenswrapper[4563]: I1124 09:21:04.499012 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.498993341 podStartE2EDuration="2.498993341s" podCreationTimestamp="2025-11-24 09:21:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:21:04.48811728 +0000 UTC m=+1041.747094727" watchObservedRunningTime="2025-11-24 09:21:04.498993341 +0000 UTC m=+1041.757970789" Nov 24 09:21:04 crc kubenswrapper[4563]: I1124 09:21:04.739850 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-94qxv"] Nov 24 09:21:04 crc kubenswrapper[4563]: I1124 09:21:04.741352 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-94qxv" Nov 24 09:21:04 crc kubenswrapper[4563]: I1124 09:21:04.744056 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Nov 24 09:21:04 crc kubenswrapper[4563]: I1124 09:21:04.751035 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Nov 24 09:21:04 crc kubenswrapper[4563]: I1124 09:21:04.753711 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-94qxv"] Nov 24 09:21:04 crc kubenswrapper[4563]: I1124 09:21:04.889303 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f81c63b6-56cb-41c4-b2ad-8eb1d99419d8-scripts\") pod \"nova-cell1-cell-mapping-94qxv\" (UID: \"f81c63b6-56cb-41c4-b2ad-8eb1d99419d8\") " pod="openstack/nova-cell1-cell-mapping-94qxv" Nov 24 09:21:04 crc kubenswrapper[4563]: I1124 09:21:04.889499 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt8dz\" (UniqueName: \"kubernetes.io/projected/f81c63b6-56cb-41c4-b2ad-8eb1d99419d8-kube-api-access-wt8dz\") pod \"nova-cell1-cell-mapping-94qxv\" (UID: \"f81c63b6-56cb-41c4-b2ad-8eb1d99419d8\") " pod="openstack/nova-cell1-cell-mapping-94qxv" Nov 24 09:21:04 crc kubenswrapper[4563]: I1124 09:21:04.889552 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f81c63b6-56cb-41c4-b2ad-8eb1d99419d8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-94qxv\" (UID: \"f81c63b6-56cb-41c4-b2ad-8eb1d99419d8\") " pod="openstack/nova-cell1-cell-mapping-94qxv" Nov 24 09:21:04 crc kubenswrapper[4563]: I1124 09:21:04.889577 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/f81c63b6-56cb-41c4-b2ad-8eb1d99419d8-config-data\") pod \"nova-cell1-cell-mapping-94qxv\" (UID: \"f81c63b6-56cb-41c4-b2ad-8eb1d99419d8\") " pod="openstack/nova-cell1-cell-mapping-94qxv" Nov 24 09:21:04 crc kubenswrapper[4563]: I1124 09:21:04.993685 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt8dz\" (UniqueName: \"kubernetes.io/projected/f81c63b6-56cb-41c4-b2ad-8eb1d99419d8-kube-api-access-wt8dz\") pod \"nova-cell1-cell-mapping-94qxv\" (UID: \"f81c63b6-56cb-41c4-b2ad-8eb1d99419d8\") " pod="openstack/nova-cell1-cell-mapping-94qxv" Nov 24 09:21:04 crc kubenswrapper[4563]: I1124 09:21:04.993743 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f81c63b6-56cb-41c4-b2ad-8eb1d99419d8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-94qxv\" (UID: \"f81c63b6-56cb-41c4-b2ad-8eb1d99419d8\") " pod="openstack/nova-cell1-cell-mapping-94qxv" Nov 24 09:21:04 crc kubenswrapper[4563]: I1124 09:21:04.993769 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f81c63b6-56cb-41c4-b2ad-8eb1d99419d8-config-data\") pod \"nova-cell1-cell-mapping-94qxv\" (UID: \"f81c63b6-56cb-41c4-b2ad-8eb1d99419d8\") " pod="openstack/nova-cell1-cell-mapping-94qxv" Nov 24 09:21:04 crc kubenswrapper[4563]: I1124 09:21:04.994220 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f81c63b6-56cb-41c4-b2ad-8eb1d99419d8-scripts\") pod \"nova-cell1-cell-mapping-94qxv\" (UID: \"f81c63b6-56cb-41c4-b2ad-8eb1d99419d8\") " pod="openstack/nova-cell1-cell-mapping-94qxv" Nov 24 09:21:04 crc kubenswrapper[4563]: I1124 09:21:04.999046 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f81c63b6-56cb-41c4-b2ad-8eb1d99419d8-scripts\") pod \"nova-cell1-cell-mapping-94qxv\" (UID: \"f81c63b6-56cb-41c4-b2ad-8eb1d99419d8\") " pod="openstack/nova-cell1-cell-mapping-94qxv" Nov 24 09:21:04 crc kubenswrapper[4563]: I1124 09:21:04.999370 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f81c63b6-56cb-41c4-b2ad-8eb1d99419d8-config-data\") pod \"nova-cell1-cell-mapping-94qxv\" (UID: \"f81c63b6-56cb-41c4-b2ad-8eb1d99419d8\") " pod="openstack/nova-cell1-cell-mapping-94qxv" Nov 24 09:21:05 crc kubenswrapper[4563]: I1124 09:21:05.001915 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f81c63b6-56cb-41c4-b2ad-8eb1d99419d8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-94qxv\" (UID: \"f81c63b6-56cb-41c4-b2ad-8eb1d99419d8\") " pod="openstack/nova-cell1-cell-mapping-94qxv" Nov 24 09:21:05 crc kubenswrapper[4563]: I1124 09:21:05.008822 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt8dz\" (UniqueName: \"kubernetes.io/projected/f81c63b6-56cb-41c4-b2ad-8eb1d99419d8-kube-api-access-wt8dz\") pod \"nova-cell1-cell-mapping-94qxv\" (UID: \"f81c63b6-56cb-41c4-b2ad-8eb1d99419d8\") " pod="openstack/nova-cell1-cell-mapping-94qxv" Nov 24 09:21:05 crc kubenswrapper[4563]: I1124 09:21:05.203578 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-94qxv" Nov 24 09:21:05 crc kubenswrapper[4563]: I1124 09:21:05.485297 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86acf291-2839-49f7-aaf3-33ba6e0cae2e","Type":"ContainerStarted","Data":"4d3132028b9df19282985e2b93c9a86e8e9d52a5aa1163aa53d7a0dc3b8db82a"} Nov 24 09:21:05 crc kubenswrapper[4563]: I1124 09:21:05.513665 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.032075418 podStartE2EDuration="5.513623551s" podCreationTimestamp="2025-11-24 09:21:00 +0000 UTC" firstStartedPulling="2025-11-24 09:21:01.250723275 +0000 UTC m=+1038.509700722" lastFinishedPulling="2025-11-24 09:21:04.732271408 +0000 UTC m=+1041.991248855" observedRunningTime="2025-11-24 09:21:05.510099946 +0000 UTC m=+1042.769077393" watchObservedRunningTime="2025-11-24 09:21:05.513623551 +0000 UTC m=+1042.772600998" Nov 24 09:21:05 crc kubenswrapper[4563]: I1124 09:21:05.600431 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-94qxv"] Nov 24 09:21:05 crc kubenswrapper[4563]: I1124 09:21:05.876285 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" Nov 24 09:21:05 crc kubenswrapper[4563]: I1124 09:21:05.950477 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dd7c4987f-f758b"] Nov 24 09:21:05 crc kubenswrapper[4563]: I1124 09:21:05.950772 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" podUID="775efdde-5acb-4276-ab9d-bd4644541c9b" containerName="dnsmasq-dns" containerID="cri-o://fed3a1794add98608d53791e138de31c21dff80d920b1a07e1a6b7624c1047b9" gracePeriod=10 Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.431027 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.495778 4563 generic.go:334] "Generic (PLEG): container finished" podID="775efdde-5acb-4276-ab9d-bd4644541c9b" containerID="fed3a1794add98608d53791e138de31c21dff80d920b1a07e1a6b7624c1047b9" exitCode=0 Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.496549 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" event={"ID":"775efdde-5acb-4276-ab9d-bd4644541c9b","Type":"ContainerDied","Data":"fed3a1794add98608d53791e138de31c21dff80d920b1a07e1a6b7624c1047b9"} Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.496675 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" event={"ID":"775efdde-5acb-4276-ab9d-bd4644541c9b","Type":"ContainerDied","Data":"f2676ef731492b17eec6d39c5d8f59a39de58d23a5fead70323b3088b500d00b"} Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.496755 4563 scope.go:117] "RemoveContainer" containerID="fed3a1794add98608d53791e138de31c21dff80d920b1a07e1a6b7624c1047b9" Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.496957 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dd7c4987f-f758b" Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.500444 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-94qxv" event={"ID":"f81c63b6-56cb-41c4-b2ad-8eb1d99419d8","Type":"ContainerStarted","Data":"a4aa16d5c3909ca3c0970da3ca2011671b929e433a1a535f46766005bcb45570"} Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.500559 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.500618 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-94qxv" event={"ID":"f81c63b6-56cb-41c4-b2ad-8eb1d99419d8","Type":"ContainerStarted","Data":"1780c09ae75bb38271385ef0dc5847fdeb75cc578e2431c12a4ed0a904e432c1"} Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.529580 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-94qxv" podStartSLOduration=2.529559773 podStartE2EDuration="2.529559773s" podCreationTimestamp="2025-11-24 09:21:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:21:06.520518302 +0000 UTC m=+1043.779495749" watchObservedRunningTime="2025-11-24 09:21:06.529559773 +0000 UTC m=+1043.788537220" Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.538923 4563 scope.go:117] "RemoveContainer" containerID="77403049f76b0bab1ca45388ca29a86c008855ecaac23a298ad8812919854023" Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.577000 4563 scope.go:117] "RemoveContainer" containerID="fed3a1794add98608d53791e138de31c21dff80d920b1a07e1a6b7624c1047b9" Nov 24 09:21:06 crc kubenswrapper[4563]: E1124 09:21:06.577689 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"fed3a1794add98608d53791e138de31c21dff80d920b1a07e1a6b7624c1047b9\": container with ID starting with fed3a1794add98608d53791e138de31c21dff80d920b1a07e1a6b7624c1047b9 not found: ID does not exist" containerID="fed3a1794add98608d53791e138de31c21dff80d920b1a07e1a6b7624c1047b9" Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.577734 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fed3a1794add98608d53791e138de31c21dff80d920b1a07e1a6b7624c1047b9"} err="failed to get container status \"fed3a1794add98608d53791e138de31c21dff80d920b1a07e1a6b7624c1047b9\": rpc error: code = NotFound desc = could not find container \"fed3a1794add98608d53791e138de31c21dff80d920b1a07e1a6b7624c1047b9\": container with ID starting with fed3a1794add98608d53791e138de31c21dff80d920b1a07e1a6b7624c1047b9 not found: ID does not exist" Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.577765 4563 scope.go:117] "RemoveContainer" containerID="77403049f76b0bab1ca45388ca29a86c008855ecaac23a298ad8812919854023" Nov 24 09:21:06 crc kubenswrapper[4563]: E1124 09:21:06.578080 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77403049f76b0bab1ca45388ca29a86c008855ecaac23a298ad8812919854023\": container with ID starting with 77403049f76b0bab1ca45388ca29a86c008855ecaac23a298ad8812919854023 not found: ID does not exist" containerID="77403049f76b0bab1ca45388ca29a86c008855ecaac23a298ad8812919854023" Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.578119 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77403049f76b0bab1ca45388ca29a86c008855ecaac23a298ad8812919854023"} err="failed to get container status \"77403049f76b0bab1ca45388ca29a86c008855ecaac23a298ad8812919854023\": rpc error: code = NotFound desc = could not find container \"77403049f76b0bab1ca45388ca29a86c008855ecaac23a298ad8812919854023\": container with ID 
starting with 77403049f76b0bab1ca45388ca29a86c008855ecaac23a298ad8812919854023 not found: ID does not exist" Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.631344 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6l7b\" (UniqueName: \"kubernetes.io/projected/775efdde-5acb-4276-ab9d-bd4644541c9b-kube-api-access-z6l7b\") pod \"775efdde-5acb-4276-ab9d-bd4644541c9b\" (UID: \"775efdde-5acb-4276-ab9d-bd4644541c9b\") " Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.631418 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-ovsdbserver-sb\") pod \"775efdde-5acb-4276-ab9d-bd4644541c9b\" (UID: \"775efdde-5acb-4276-ab9d-bd4644541c9b\") " Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.631530 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-dns-swift-storage-0\") pod \"775efdde-5acb-4276-ab9d-bd4644541c9b\" (UID: \"775efdde-5acb-4276-ab9d-bd4644541c9b\") " Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.631549 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-ovsdbserver-nb\") pod \"775efdde-5acb-4276-ab9d-bd4644541c9b\" (UID: \"775efdde-5acb-4276-ab9d-bd4644541c9b\") " Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.631720 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-dns-svc\") pod \"775efdde-5acb-4276-ab9d-bd4644541c9b\" (UID: \"775efdde-5acb-4276-ab9d-bd4644541c9b\") " Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.631874 4563 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-config\") pod \"775efdde-5acb-4276-ab9d-bd4644541c9b\" (UID: \"775efdde-5acb-4276-ab9d-bd4644541c9b\") " Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.636882 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/775efdde-5acb-4276-ab9d-bd4644541c9b-kube-api-access-z6l7b" (OuterVolumeSpecName: "kube-api-access-z6l7b") pod "775efdde-5acb-4276-ab9d-bd4644541c9b" (UID: "775efdde-5acb-4276-ab9d-bd4644541c9b"). InnerVolumeSpecName "kube-api-access-z6l7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.675465 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-config" (OuterVolumeSpecName: "config") pod "775efdde-5acb-4276-ab9d-bd4644541c9b" (UID: "775efdde-5acb-4276-ab9d-bd4644541c9b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.681449 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "775efdde-5acb-4276-ab9d-bd4644541c9b" (UID: "775efdde-5acb-4276-ab9d-bd4644541c9b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.682513 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "775efdde-5acb-4276-ab9d-bd4644541c9b" (UID: "775efdde-5acb-4276-ab9d-bd4644541c9b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.694070 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "775efdde-5acb-4276-ab9d-bd4644541c9b" (UID: "775efdde-5acb-4276-ab9d-bd4644541c9b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.711409 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "775efdde-5acb-4276-ab9d-bd4644541c9b" (UID: "775efdde-5acb-4276-ab9d-bd4644541c9b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.734835 4563 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.734861 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.734874 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6l7b\" (UniqueName: \"kubernetes.io/projected/775efdde-5acb-4276-ab9d-bd4644541c9b-kube-api-access-z6l7b\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.734886 4563 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-ovsdbserver-sb\") on node \"crc\" 
DevicePath \"\"" Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.734896 4563 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.734905 4563 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/775efdde-5acb-4276-ab9d-bd4644541c9b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.821475 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dd7c4987f-f758b"] Nov 24 09:21:06 crc kubenswrapper[4563]: I1124 09:21:06.828479 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5dd7c4987f-f758b"] Nov 24 09:21:07 crc kubenswrapper[4563]: I1124 09:21:07.064209 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="775efdde-5acb-4276-ab9d-bd4644541c9b" path="/var/lib/kubelet/pods/775efdde-5acb-4276-ab9d-bd4644541c9b/volumes" Nov 24 09:21:08 crc kubenswrapper[4563]: I1124 09:21:08.988046 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:21:08 crc kubenswrapper[4563]: I1124 09:21:08.988381 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:21:08 crc kubenswrapper[4563]: I1124 09:21:08.988440 4563 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" Nov 24 09:21:08 crc kubenswrapper[4563]: I1124 09:21:08.989434 4563 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"03f875f88eef557bff28f5ed0a9f361fdac2df81584f5d9cbef7e181ab4ba280"} pod="openshift-machine-config-operator/machine-config-daemon-stlxr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 09:21:08 crc kubenswrapper[4563]: I1124 09:21:08.989498 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" containerID="cri-o://03f875f88eef557bff28f5ed0a9f361fdac2df81584f5d9cbef7e181ab4ba280" gracePeriod=600 Nov 24 09:21:09 crc kubenswrapper[4563]: I1124 09:21:09.530178 4563 generic.go:334] "Generic (PLEG): container finished" podID="3b2bfe55-8989-49b3-bb61-e28189447627" containerID="03f875f88eef557bff28f5ed0a9f361fdac2df81584f5d9cbef7e181ab4ba280" exitCode=0 Nov 24 09:21:09 crc kubenswrapper[4563]: I1124 09:21:09.530266 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" event={"ID":"3b2bfe55-8989-49b3-bb61-e28189447627","Type":"ContainerDied","Data":"03f875f88eef557bff28f5ed0a9f361fdac2df81584f5d9cbef7e181ab4ba280"} Nov 24 09:21:09 crc kubenswrapper[4563]: I1124 09:21:09.530438 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" event={"ID":"3b2bfe55-8989-49b3-bb61-e28189447627","Type":"ContainerStarted","Data":"910d483568e80e7c5051043679fd4f1476c6c059619a02146f6b5a112281d18f"} Nov 24 09:21:09 crc kubenswrapper[4563]: I1124 09:21:09.530463 4563 scope.go:117] "RemoveContainer" 
containerID="6964b83d094213dce29e8bc08bcdab313730109f615c70e3b48ab3147ba318f2" Nov 24 09:21:10 crc kubenswrapper[4563]: I1124 09:21:10.542596 4563 generic.go:334] "Generic (PLEG): container finished" podID="f81c63b6-56cb-41c4-b2ad-8eb1d99419d8" containerID="a4aa16d5c3909ca3c0970da3ca2011671b929e433a1a535f46766005bcb45570" exitCode=0 Nov 24 09:21:10 crc kubenswrapper[4563]: I1124 09:21:10.542675 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-94qxv" event={"ID":"f81c63b6-56cb-41c4-b2ad-8eb1d99419d8","Type":"ContainerDied","Data":"a4aa16d5c3909ca3c0970da3ca2011671b929e433a1a535f46766005bcb45570"} Nov 24 09:21:11 crc kubenswrapper[4563]: I1124 09:21:11.894138 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-94qxv" Nov 24 09:21:12 crc kubenswrapper[4563]: I1124 09:21:12.051601 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f81c63b6-56cb-41c4-b2ad-8eb1d99419d8-scripts\") pod \"f81c63b6-56cb-41c4-b2ad-8eb1d99419d8\" (UID: \"f81c63b6-56cb-41c4-b2ad-8eb1d99419d8\") " Nov 24 09:21:12 crc kubenswrapper[4563]: I1124 09:21:12.051730 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f81c63b6-56cb-41c4-b2ad-8eb1d99419d8-combined-ca-bundle\") pod \"f81c63b6-56cb-41c4-b2ad-8eb1d99419d8\" (UID: \"f81c63b6-56cb-41c4-b2ad-8eb1d99419d8\") " Nov 24 09:21:12 crc kubenswrapper[4563]: I1124 09:21:12.051758 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f81c63b6-56cb-41c4-b2ad-8eb1d99419d8-config-data\") pod \"f81c63b6-56cb-41c4-b2ad-8eb1d99419d8\" (UID: \"f81c63b6-56cb-41c4-b2ad-8eb1d99419d8\") " Nov 24 09:21:12 crc kubenswrapper[4563]: I1124 09:21:12.051831 4563 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-wt8dz\" (UniqueName: \"kubernetes.io/projected/f81c63b6-56cb-41c4-b2ad-8eb1d99419d8-kube-api-access-wt8dz\") pod \"f81c63b6-56cb-41c4-b2ad-8eb1d99419d8\" (UID: \"f81c63b6-56cb-41c4-b2ad-8eb1d99419d8\") " Nov 24 09:21:12 crc kubenswrapper[4563]: I1124 09:21:12.059761 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f81c63b6-56cb-41c4-b2ad-8eb1d99419d8-scripts" (OuterVolumeSpecName: "scripts") pod "f81c63b6-56cb-41c4-b2ad-8eb1d99419d8" (UID: "f81c63b6-56cb-41c4-b2ad-8eb1d99419d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:21:12 crc kubenswrapper[4563]: I1124 09:21:12.073795 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f81c63b6-56cb-41c4-b2ad-8eb1d99419d8-kube-api-access-wt8dz" (OuterVolumeSpecName: "kube-api-access-wt8dz") pod "f81c63b6-56cb-41c4-b2ad-8eb1d99419d8" (UID: "f81c63b6-56cb-41c4-b2ad-8eb1d99419d8"). InnerVolumeSpecName "kube-api-access-wt8dz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:21:12 crc kubenswrapper[4563]: I1124 09:21:12.105865 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f81c63b6-56cb-41c4-b2ad-8eb1d99419d8-config-data" (OuterVolumeSpecName: "config-data") pod "f81c63b6-56cb-41c4-b2ad-8eb1d99419d8" (UID: "f81c63b6-56cb-41c4-b2ad-8eb1d99419d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:21:12 crc kubenswrapper[4563]: I1124 09:21:12.109868 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f81c63b6-56cb-41c4-b2ad-8eb1d99419d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f81c63b6-56cb-41c4-b2ad-8eb1d99419d8" (UID: "f81c63b6-56cb-41c4-b2ad-8eb1d99419d8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:21:12 crc kubenswrapper[4563]: I1124 09:21:12.156969 4563 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f81c63b6-56cb-41c4-b2ad-8eb1d99419d8-scripts\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:12 crc kubenswrapper[4563]: I1124 09:21:12.156997 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f81c63b6-56cb-41c4-b2ad-8eb1d99419d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:12 crc kubenswrapper[4563]: I1124 09:21:12.157010 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f81c63b6-56cb-41c4-b2ad-8eb1d99419d8-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:12 crc kubenswrapper[4563]: I1124 09:21:12.157021 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt8dz\" (UniqueName: \"kubernetes.io/projected/f81c63b6-56cb-41c4-b2ad-8eb1d99419d8-kube-api-access-wt8dz\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:12 crc kubenswrapper[4563]: I1124 09:21:12.560957 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-94qxv" event={"ID":"f81c63b6-56cb-41c4-b2ad-8eb1d99419d8","Type":"ContainerDied","Data":"1780c09ae75bb38271385ef0dc5847fdeb75cc578e2431c12a4ed0a904e432c1"} Nov 24 09:21:12 crc kubenswrapper[4563]: I1124 09:21:12.561426 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1780c09ae75bb38271385ef0dc5847fdeb75cc578e2431c12a4ed0a904e432c1" Nov 24 09:21:12 crc kubenswrapper[4563]: I1124 09:21:12.561025 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-94qxv" Nov 24 09:21:12 crc kubenswrapper[4563]: I1124 09:21:12.727537 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 09:21:12 crc kubenswrapper[4563]: I1124 09:21:12.727796 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="5021efe4-f1d2-4762-a196-c2f9b4266ba9" containerName="nova-scheduler-scheduler" containerID="cri-o://43fe1e4be27621f18430e2fd2123c205a977749861ba4d745645d70c938bb9c7" gracePeriod=30 Nov 24 09:21:12 crc kubenswrapper[4563]: I1124 09:21:12.735276 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 24 09:21:12 crc kubenswrapper[4563]: I1124 09:21:12.735582 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3" containerName="nova-api-log" containerID="cri-o://8f2b3b9ef627837848f57d6083de9192844fcd27251c91725168610997ca5743" gracePeriod=30 Nov 24 09:21:12 crc kubenswrapper[4563]: I1124 09:21:12.735678 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3" containerName="nova-api-api" containerID="cri-o://4172ada8a05efc2a60eeabed50778fe4acc63311db4765927d67a45967146a8a" gracePeriod=30 Nov 24 09:21:12 crc kubenswrapper[4563]: I1124 09:21:12.742139 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 09:21:12 crc kubenswrapper[4563]: I1124 09:21:12.742343 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0301f820-dbcb-4cad-a180-7aed80a46db6" containerName="nova-metadata-log" containerID="cri-o://8e45d102bfc00337918f022eba7fdc2ec81c58445a0f8a5b0918143e9e719448" gracePeriod=30 Nov 24 09:21:12 crc kubenswrapper[4563]: I1124 09:21:12.742391 4563 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0301f820-dbcb-4cad-a180-7aed80a46db6" containerName="nova-metadata-metadata" containerID="cri-o://ed5c0d0d29cd0fdfbcabfda109f7a94e42f2acedf61384a52af79263053de4f5" gracePeriod=30 Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.269353 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.379197 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-internal-tls-certs\") pod \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\" (UID: \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\") " Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.379296 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-config-data\") pod \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\" (UID: \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\") " Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.379477 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d89c\" (UniqueName: \"kubernetes.io/projected/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-kube-api-access-6d89c\") pod \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\" (UID: \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\") " Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.379553 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-logs\") pod \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\" (UID: \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\") " Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.379777 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-public-tls-certs\") pod \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\" (UID: \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\") " Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.379956 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-combined-ca-bundle\") pod \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\" (UID: \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\") " Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.380212 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-logs" (OuterVolumeSpecName: "logs") pod "8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3" (UID: "8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.386550 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-kube-api-access-6d89c" (OuterVolumeSpecName: "kube-api-access-6d89c") pod "8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3" (UID: "8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3"). InnerVolumeSpecName "kube-api-access-6d89c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.413824 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3" (UID: "8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.414709 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-config-data" (OuterVolumeSpecName: "config-data") pod "8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3" (UID: "8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:21:13 crc kubenswrapper[4563]: E1124 09:21:13.433785 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-internal-tls-certs podName:8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3 nodeName:}" failed. No retries permitted until 2025-11-24 09:21:13.933755585 +0000 UTC m=+1051.192733033 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-internal-tls-certs") pod "8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3" (UID: "8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3") : error deleting /var/lib/kubelet/pods/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3/volume-subpaths: remove /var/lib/kubelet/pods/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3/volume-subpaths: no such file or directory Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.436491 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3" (UID: "8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.488558 4563 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.488584 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.488596 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.488607 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d89c\" (UniqueName: \"kubernetes.io/projected/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-kube-api-access-6d89c\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.488617 4563 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-logs\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.573230 4563 generic.go:334] "Generic (PLEG): container finished" podID="8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3" containerID="4172ada8a05efc2a60eeabed50778fe4acc63311db4765927d67a45967146a8a" exitCode=0 Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.573267 4563 generic.go:334] "Generic (PLEG): container finished" podID="8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3" containerID="8f2b3b9ef627837848f57d6083de9192844fcd27251c91725168610997ca5743" exitCode=143 Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.573326 4563 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.573320 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3","Type":"ContainerDied","Data":"4172ada8a05efc2a60eeabed50778fe4acc63311db4765927d67a45967146a8a"} Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.573411 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3","Type":"ContainerDied","Data":"8f2b3b9ef627837848f57d6083de9192844fcd27251c91725168610997ca5743"} Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.573426 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3","Type":"ContainerDied","Data":"6d93f0fa34437a670d4c2f9f06c29772e1390789ea73b7b8ccc2b747a57e48a4"} Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.573453 4563 scope.go:117] "RemoveContainer" containerID="4172ada8a05efc2a60eeabed50778fe4acc63311db4765927d67a45967146a8a" Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.579985 4563 generic.go:334] "Generic (PLEG): container finished" podID="0301f820-dbcb-4cad-a180-7aed80a46db6" containerID="8e45d102bfc00337918f022eba7fdc2ec81c58445a0f8a5b0918143e9e719448" exitCode=143 Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.580053 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0301f820-dbcb-4cad-a180-7aed80a46db6","Type":"ContainerDied","Data":"8e45d102bfc00337918f022eba7fdc2ec81c58445a0f8a5b0918143e9e719448"} Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.615068 4563 scope.go:117] "RemoveContainer" containerID="8f2b3b9ef627837848f57d6083de9192844fcd27251c91725168610997ca5743" Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.632615 4563 scope.go:117] "RemoveContainer" 
containerID="4172ada8a05efc2a60eeabed50778fe4acc63311db4765927d67a45967146a8a" Nov 24 09:21:13 crc kubenswrapper[4563]: E1124 09:21:13.632997 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4172ada8a05efc2a60eeabed50778fe4acc63311db4765927d67a45967146a8a\": container with ID starting with 4172ada8a05efc2a60eeabed50778fe4acc63311db4765927d67a45967146a8a not found: ID does not exist" containerID="4172ada8a05efc2a60eeabed50778fe4acc63311db4765927d67a45967146a8a" Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.633041 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4172ada8a05efc2a60eeabed50778fe4acc63311db4765927d67a45967146a8a"} err="failed to get container status \"4172ada8a05efc2a60eeabed50778fe4acc63311db4765927d67a45967146a8a\": rpc error: code = NotFound desc = could not find container \"4172ada8a05efc2a60eeabed50778fe4acc63311db4765927d67a45967146a8a\": container with ID starting with 4172ada8a05efc2a60eeabed50778fe4acc63311db4765927d67a45967146a8a not found: ID does not exist" Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.633067 4563 scope.go:117] "RemoveContainer" containerID="8f2b3b9ef627837848f57d6083de9192844fcd27251c91725168610997ca5743" Nov 24 09:21:13 crc kubenswrapper[4563]: E1124 09:21:13.633298 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f2b3b9ef627837848f57d6083de9192844fcd27251c91725168610997ca5743\": container with ID starting with 8f2b3b9ef627837848f57d6083de9192844fcd27251c91725168610997ca5743 not found: ID does not exist" containerID="8f2b3b9ef627837848f57d6083de9192844fcd27251c91725168610997ca5743" Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.633322 4563 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8f2b3b9ef627837848f57d6083de9192844fcd27251c91725168610997ca5743"} err="failed to get container status \"8f2b3b9ef627837848f57d6083de9192844fcd27251c91725168610997ca5743\": rpc error: code = NotFound desc = could not find container \"8f2b3b9ef627837848f57d6083de9192844fcd27251c91725168610997ca5743\": container with ID starting with 8f2b3b9ef627837848f57d6083de9192844fcd27251c91725168610997ca5743 not found: ID does not exist" Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.633340 4563 scope.go:117] "RemoveContainer" containerID="4172ada8a05efc2a60eeabed50778fe4acc63311db4765927d67a45967146a8a" Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.633593 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4172ada8a05efc2a60eeabed50778fe4acc63311db4765927d67a45967146a8a"} err="failed to get container status \"4172ada8a05efc2a60eeabed50778fe4acc63311db4765927d67a45967146a8a\": rpc error: code = NotFound desc = could not find container \"4172ada8a05efc2a60eeabed50778fe4acc63311db4765927d67a45967146a8a\": container with ID starting with 4172ada8a05efc2a60eeabed50778fe4acc63311db4765927d67a45967146a8a not found: ID does not exist" Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.633609 4563 scope.go:117] "RemoveContainer" containerID="8f2b3b9ef627837848f57d6083de9192844fcd27251c91725168610997ca5743" Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.633914 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f2b3b9ef627837848f57d6083de9192844fcd27251c91725168610997ca5743"} err="failed to get container status \"8f2b3b9ef627837848f57d6083de9192844fcd27251c91725168610997ca5743\": rpc error: code = NotFound desc = could not find container \"8f2b3b9ef627837848f57d6083de9192844fcd27251c91725168610997ca5743\": container with ID starting with 8f2b3b9ef627837848f57d6083de9192844fcd27251c91725168610997ca5743 not found: ID does not 
exist" Nov 24 09:21:13 crc kubenswrapper[4563]: E1124 09:21:13.667013 4563 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="43fe1e4be27621f18430e2fd2123c205a977749861ba4d745645d70c938bb9c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 24 09:21:13 crc kubenswrapper[4563]: E1124 09:21:13.673735 4563 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="43fe1e4be27621f18430e2fd2123c205a977749861ba4d745645d70c938bb9c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 24 09:21:13 crc kubenswrapper[4563]: E1124 09:21:13.676121 4563 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="43fe1e4be27621f18430e2fd2123c205a977749861ba4d745645d70c938bb9c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 24 09:21:13 crc kubenswrapper[4563]: E1124 09:21:13.676177 4563 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="5021efe4-f1d2-4762-a196-c2f9b4266ba9" containerName="nova-scheduler-scheduler" Nov 24 09:21:13 crc kubenswrapper[4563]: I1124 09:21:13.996243 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-internal-tls-certs\") pod \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\" (UID: \"8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3\") " Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.000355 4563 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3" (UID: "8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.098827 4563 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.201547 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.207002 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.212189 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 24 09:21:14 crc kubenswrapper[4563]: E1124 09:21:14.212589 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f81c63b6-56cb-41c4-b2ad-8eb1d99419d8" containerName="nova-manage" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.212608 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="f81c63b6-56cb-41c4-b2ad-8eb1d99419d8" containerName="nova-manage" Nov 24 09:21:14 crc kubenswrapper[4563]: E1124 09:21:14.212626 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3" containerName="nova-api-api" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.212633 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3" containerName="nova-api-api" Nov 24 09:21:14 crc kubenswrapper[4563]: E1124 09:21:14.212667 4563 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3" containerName="nova-api-log" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.212672 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3" containerName="nova-api-log" Nov 24 09:21:14 crc kubenswrapper[4563]: E1124 09:21:14.212684 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="775efdde-5acb-4276-ab9d-bd4644541c9b" containerName="dnsmasq-dns" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.212689 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="775efdde-5acb-4276-ab9d-bd4644541c9b" containerName="dnsmasq-dns" Nov 24 09:21:14 crc kubenswrapper[4563]: E1124 09:21:14.212707 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="775efdde-5acb-4276-ab9d-bd4644541c9b" containerName="init" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.212714 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="775efdde-5acb-4276-ab9d-bd4644541c9b" containerName="init" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.212876 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3" containerName="nova-api-log" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.212891 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="f81c63b6-56cb-41c4-b2ad-8eb1d99419d8" containerName="nova-manage" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.212908 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="775efdde-5acb-4276-ab9d-bd4644541c9b" containerName="dnsmasq-dns" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.212918 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3" containerName="nova-api-api" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.213838 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.215410 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.215607 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.216040 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.228374 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.301985 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6-public-tls-certs\") pod \"nova-api-0\" (UID: \"3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6\") " pod="openstack/nova-api-0" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.302277 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6\") " pod="openstack/nova-api-0" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.302507 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6-logs\") pod \"nova-api-0\" (UID: \"3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6\") " pod="openstack/nova-api-0" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.302658 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdktj\" (UniqueName: 
\"kubernetes.io/projected/3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6-kube-api-access-fdktj\") pod \"nova-api-0\" (UID: \"3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6\") " pod="openstack/nova-api-0" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.302700 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6-config-data\") pod \"nova-api-0\" (UID: \"3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6\") " pod="openstack/nova-api-0" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.302751 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6\") " pod="openstack/nova-api-0" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.406061 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6\") " pod="openstack/nova-api-0" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.406207 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6-logs\") pod \"nova-api-0\" (UID: \"3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6\") " pod="openstack/nova-api-0" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.406297 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdktj\" (UniqueName: \"kubernetes.io/projected/3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6-kube-api-access-fdktj\") pod \"nova-api-0\" (UID: \"3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6\") " pod="openstack/nova-api-0" Nov 24 09:21:14 
crc kubenswrapper[4563]: I1124 09:21:14.406328 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6-config-data\") pod \"nova-api-0\" (UID: \"3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6\") " pod="openstack/nova-api-0" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.406373 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6\") " pod="openstack/nova-api-0" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.406414 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6-public-tls-certs\") pod \"nova-api-0\" (UID: \"3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6\") " pod="openstack/nova-api-0" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.412195 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6-logs\") pod \"nova-api-0\" (UID: \"3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6\") " pod="openstack/nova-api-0" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.412537 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6-public-tls-certs\") pod \"nova-api-0\" (UID: \"3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6\") " pod="openstack/nova-api-0" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.416350 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6\") " pod="openstack/nova-api-0" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.416420 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6\") " pod="openstack/nova-api-0" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.416474 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6-config-data\") pod \"nova-api-0\" (UID: \"3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6\") " pod="openstack/nova-api-0" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.426271 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdktj\" (UniqueName: \"kubernetes.io/projected/3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6-kube-api-access-fdktj\") pod \"nova-api-0\" (UID: \"3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6\") " pod="openstack/nova-api-0" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.545307 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 24 09:21:14 crc kubenswrapper[4563]: I1124 09:21:14.968606 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 24 09:21:14 crc kubenswrapper[4563]: W1124 09:21:14.971985 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3da51b3b_6286_4cb5_bcc7_d715eb2fe2a6.slice/crio-a1bdfb732f098864179960c2c81fbd00eb6262f2809b338da57717e647086b27 WatchSource:0}: Error finding container a1bdfb732f098864179960c2c81fbd00eb6262f2809b338da57717e647086b27: Status 404 returned error can't find the container with id a1bdfb732f098864179960c2c81fbd00eb6262f2809b338da57717e647086b27 Nov 24 09:21:15 crc kubenswrapper[4563]: I1124 09:21:15.064466 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3" path="/var/lib/kubelet/pods/8bc2cfa3-1ea6-4fa8-a9ae-80d2280891a3/volumes" Nov 24 09:21:15 crc kubenswrapper[4563]: I1124 09:21:15.616858 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6","Type":"ContainerStarted","Data":"79f4d6dc1809aa2556a6eeb8b772894c915d1f9468087c6b487f3fdf2b6f0a2c"} Nov 24 09:21:15 crc kubenswrapper[4563]: I1124 09:21:15.617150 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6","Type":"ContainerStarted","Data":"a0aee6fc6b4a584b9529ca85473fa5958d502228e5e0cff19eb94c08b31d152a"} Nov 24 09:21:15 crc kubenswrapper[4563]: I1124 09:21:15.617162 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6","Type":"ContainerStarted","Data":"a1bdfb732f098864179960c2c81fbd00eb6262f2809b338da57717e647086b27"} Nov 24 09:21:15 crc kubenswrapper[4563]: I1124 09:21:15.634994 4563 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.634979037 podStartE2EDuration="1.634979037s" podCreationTimestamp="2025-11-24 09:21:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:21:15.630853086 +0000 UTC m=+1052.889830532" watchObservedRunningTime="2025-11-24 09:21:15.634979037 +0000 UTC m=+1052.893956484" Nov 24 09:21:15 crc kubenswrapper[4563]: I1124 09:21:15.873449 4563 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="0301f820-dbcb-4cad-a180-7aed80a46db6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.198:8775/\": read tcp 10.217.0.2:47618->10.217.0.198:8775: read: connection reset by peer" Nov 24 09:21:15 crc kubenswrapper[4563]: I1124 09:21:15.873979 4563 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="0301f820-dbcb-4cad-a180-7aed80a46db6" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.198:8775/\": read tcp 10.217.0.2:47626->10.217.0.198:8775: read: connection reset by peer" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.248718 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.444800 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm4dx\" (UniqueName: \"kubernetes.io/projected/0301f820-dbcb-4cad-a180-7aed80a46db6-kube-api-access-mm4dx\") pod \"0301f820-dbcb-4cad-a180-7aed80a46db6\" (UID: \"0301f820-dbcb-4cad-a180-7aed80a46db6\") " Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.444954 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0301f820-dbcb-4cad-a180-7aed80a46db6-config-data\") pod \"0301f820-dbcb-4cad-a180-7aed80a46db6\" (UID: \"0301f820-dbcb-4cad-a180-7aed80a46db6\") " Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.445012 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0301f820-dbcb-4cad-a180-7aed80a46db6-nova-metadata-tls-certs\") pod \"0301f820-dbcb-4cad-a180-7aed80a46db6\" (UID: \"0301f820-dbcb-4cad-a180-7aed80a46db6\") " Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.445044 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0301f820-dbcb-4cad-a180-7aed80a46db6-combined-ca-bundle\") pod \"0301f820-dbcb-4cad-a180-7aed80a46db6\" (UID: \"0301f820-dbcb-4cad-a180-7aed80a46db6\") " Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.445144 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0301f820-dbcb-4cad-a180-7aed80a46db6-logs\") pod \"0301f820-dbcb-4cad-a180-7aed80a46db6\" (UID: \"0301f820-dbcb-4cad-a180-7aed80a46db6\") " Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.445741 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/0301f820-dbcb-4cad-a180-7aed80a46db6-logs" (OuterVolumeSpecName: "logs") pod "0301f820-dbcb-4cad-a180-7aed80a46db6" (UID: "0301f820-dbcb-4cad-a180-7aed80a46db6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.451231 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0301f820-dbcb-4cad-a180-7aed80a46db6-kube-api-access-mm4dx" (OuterVolumeSpecName: "kube-api-access-mm4dx") pod "0301f820-dbcb-4cad-a180-7aed80a46db6" (UID: "0301f820-dbcb-4cad-a180-7aed80a46db6"). InnerVolumeSpecName "kube-api-access-mm4dx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.471222 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0301f820-dbcb-4cad-a180-7aed80a46db6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0301f820-dbcb-4cad-a180-7aed80a46db6" (UID: "0301f820-dbcb-4cad-a180-7aed80a46db6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.479941 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0301f820-dbcb-4cad-a180-7aed80a46db6-config-data" (OuterVolumeSpecName: "config-data") pod "0301f820-dbcb-4cad-a180-7aed80a46db6" (UID: "0301f820-dbcb-4cad-a180-7aed80a46db6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.495862 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0301f820-dbcb-4cad-a180-7aed80a46db6-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "0301f820-dbcb-4cad-a180-7aed80a46db6" (UID: "0301f820-dbcb-4cad-a180-7aed80a46db6"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.550508 4563 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0301f820-dbcb-4cad-a180-7aed80a46db6-logs\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.550545 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm4dx\" (UniqueName: \"kubernetes.io/projected/0301f820-dbcb-4cad-a180-7aed80a46db6-kube-api-access-mm4dx\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.550560 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0301f820-dbcb-4cad-a180-7aed80a46db6-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.550573 4563 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0301f820-dbcb-4cad-a180-7aed80a46db6-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.550585 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0301f820-dbcb-4cad-a180-7aed80a46db6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.627327 4563 generic.go:334] "Generic (PLEG): container finished" podID="0301f820-dbcb-4cad-a180-7aed80a46db6" containerID="ed5c0d0d29cd0fdfbcabfda109f7a94e42f2acedf61384a52af79263053de4f5" exitCode=0 Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.627380 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0301f820-dbcb-4cad-a180-7aed80a46db6","Type":"ContainerDied","Data":"ed5c0d0d29cd0fdfbcabfda109f7a94e42f2acedf61384a52af79263053de4f5"} 
Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.627452 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0301f820-dbcb-4cad-a180-7aed80a46db6","Type":"ContainerDied","Data":"b093feded9204183db51b34303c50631cc5585b3e2fc47e5dd990f2c21cae7d1"} Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.627474 4563 scope.go:117] "RemoveContainer" containerID="ed5c0d0d29cd0fdfbcabfda109f7a94e42f2acedf61384a52af79263053de4f5" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.627404 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.652405 4563 scope.go:117] "RemoveContainer" containerID="8e45d102bfc00337918f022eba7fdc2ec81c58445a0f8a5b0918143e9e719448" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.667685 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.680359 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.686818 4563 scope.go:117] "RemoveContainer" containerID="ed5c0d0d29cd0fdfbcabfda109f7a94e42f2acedf61384a52af79263053de4f5" Nov 24 09:21:16 crc kubenswrapper[4563]: E1124 09:21:16.687210 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed5c0d0d29cd0fdfbcabfda109f7a94e42f2acedf61384a52af79263053de4f5\": container with ID starting with ed5c0d0d29cd0fdfbcabfda109f7a94e42f2acedf61384a52af79263053de4f5 not found: ID does not exist" containerID="ed5c0d0d29cd0fdfbcabfda109f7a94e42f2acedf61384a52af79263053de4f5" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.687248 4563 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ed5c0d0d29cd0fdfbcabfda109f7a94e42f2acedf61384a52af79263053de4f5"} err="failed to get container status \"ed5c0d0d29cd0fdfbcabfda109f7a94e42f2acedf61384a52af79263053de4f5\": rpc error: code = NotFound desc = could not find container \"ed5c0d0d29cd0fdfbcabfda109f7a94e42f2acedf61384a52af79263053de4f5\": container with ID starting with ed5c0d0d29cd0fdfbcabfda109f7a94e42f2acedf61384a52af79263053de4f5 not found: ID does not exist" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.687272 4563 scope.go:117] "RemoveContainer" containerID="8e45d102bfc00337918f022eba7fdc2ec81c58445a0f8a5b0918143e9e719448" Nov 24 09:21:16 crc kubenswrapper[4563]: E1124 09:21:16.687948 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e45d102bfc00337918f022eba7fdc2ec81c58445a0f8a5b0918143e9e719448\": container with ID starting with 8e45d102bfc00337918f022eba7fdc2ec81c58445a0f8a5b0918143e9e719448 not found: ID does not exist" containerID="8e45d102bfc00337918f022eba7fdc2ec81c58445a0f8a5b0918143e9e719448" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.687995 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e45d102bfc00337918f022eba7fdc2ec81c58445a0f8a5b0918143e9e719448"} err="failed to get container status \"8e45d102bfc00337918f022eba7fdc2ec81c58445a0f8a5b0918143e9e719448\": rpc error: code = NotFound desc = could not find container \"8e45d102bfc00337918f022eba7fdc2ec81c58445a0f8a5b0918143e9e719448\": container with ID starting with 8e45d102bfc00337918f022eba7fdc2ec81c58445a0f8a5b0918143e9e719448 not found: ID does not exist" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.695030 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 24 09:21:16 crc kubenswrapper[4563]: E1124 09:21:16.700988 4563 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0301f820-dbcb-4cad-a180-7aed80a46db6" containerName="nova-metadata-metadata" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.701030 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="0301f820-dbcb-4cad-a180-7aed80a46db6" containerName="nova-metadata-metadata" Nov 24 09:21:16 crc kubenswrapper[4563]: E1124 09:21:16.701250 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0301f820-dbcb-4cad-a180-7aed80a46db6" containerName="nova-metadata-log" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.701265 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="0301f820-dbcb-4cad-a180-7aed80a46db6" containerName="nova-metadata-log" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.701616 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="0301f820-dbcb-4cad-a180-7aed80a46db6" containerName="nova-metadata-metadata" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.701628 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="0301f820-dbcb-4cad-a180-7aed80a46db6" containerName="nova-metadata-log" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.703157 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.705698 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.706075 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.708576 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.857615 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtd5t\" (UniqueName: \"kubernetes.io/projected/c17fb818-0e53-4655-89ac-a1bb9022b5f8-kube-api-access-mtd5t\") pod \"nova-metadata-0\" (UID: \"c17fb818-0e53-4655-89ac-a1bb9022b5f8\") " pod="openstack/nova-metadata-0" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.860730 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c17fb818-0e53-4655-89ac-a1bb9022b5f8-config-data\") pod \"nova-metadata-0\" (UID: \"c17fb818-0e53-4655-89ac-a1bb9022b5f8\") " pod="openstack/nova-metadata-0" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.860883 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c17fb818-0e53-4655-89ac-a1bb9022b5f8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c17fb818-0e53-4655-89ac-a1bb9022b5f8\") " pod="openstack/nova-metadata-0" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.861211 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c17fb818-0e53-4655-89ac-a1bb9022b5f8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c17fb818-0e53-4655-89ac-a1bb9022b5f8\") " pod="openstack/nova-metadata-0" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.861507 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c17fb818-0e53-4655-89ac-a1bb9022b5f8-logs\") pod \"nova-metadata-0\" (UID: \"c17fb818-0e53-4655-89ac-a1bb9022b5f8\") " pod="openstack/nova-metadata-0" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.964178 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c17fb818-0e53-4655-89ac-a1bb9022b5f8-logs\") pod \"nova-metadata-0\" (UID: \"c17fb818-0e53-4655-89ac-a1bb9022b5f8\") " pod="openstack/nova-metadata-0" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.964326 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtd5t\" (UniqueName: \"kubernetes.io/projected/c17fb818-0e53-4655-89ac-a1bb9022b5f8-kube-api-access-mtd5t\") pod \"nova-metadata-0\" (UID: \"c17fb818-0e53-4655-89ac-a1bb9022b5f8\") " pod="openstack/nova-metadata-0" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.964443 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c17fb818-0e53-4655-89ac-a1bb9022b5f8-config-data\") pod \"nova-metadata-0\" (UID: \"c17fb818-0e53-4655-89ac-a1bb9022b5f8\") " pod="openstack/nova-metadata-0" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.964620 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c17fb818-0e53-4655-89ac-a1bb9022b5f8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c17fb818-0e53-4655-89ac-a1bb9022b5f8\") " pod="openstack/nova-metadata-0" 
Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.964663 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c17fb818-0e53-4655-89ac-a1bb9022b5f8-logs\") pod \"nova-metadata-0\" (UID: \"c17fb818-0e53-4655-89ac-a1bb9022b5f8\") " pod="openstack/nova-metadata-0" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.964759 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c17fb818-0e53-4655-89ac-a1bb9022b5f8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c17fb818-0e53-4655-89ac-a1bb9022b5f8\") " pod="openstack/nova-metadata-0" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.970228 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c17fb818-0e53-4655-89ac-a1bb9022b5f8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c17fb818-0e53-4655-89ac-a1bb9022b5f8\") " pod="openstack/nova-metadata-0" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.970549 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c17fb818-0e53-4655-89ac-a1bb9022b5f8-config-data\") pod \"nova-metadata-0\" (UID: \"c17fb818-0e53-4655-89ac-a1bb9022b5f8\") " pod="openstack/nova-metadata-0" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.980002 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c17fb818-0e53-4655-89ac-a1bb9022b5f8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c17fb818-0e53-4655-89ac-a1bb9022b5f8\") " pod="openstack/nova-metadata-0" Nov 24 09:21:16 crc kubenswrapper[4563]: I1124 09:21:16.980918 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtd5t\" (UniqueName: 
\"kubernetes.io/projected/c17fb818-0e53-4655-89ac-a1bb9022b5f8-kube-api-access-mtd5t\") pod \"nova-metadata-0\" (UID: \"c17fb818-0e53-4655-89ac-a1bb9022b5f8\") " pod="openstack/nova-metadata-0" Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.065083 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0301f820-dbcb-4cad-a180-7aed80a46db6" path="/var/lib/kubelet/pods/0301f820-dbcb-4cad-a180-7aed80a46db6/volumes" Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.094551 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.100754 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.270434 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5021efe4-f1d2-4762-a196-c2f9b4266ba9-combined-ca-bundle\") pod \"5021efe4-f1d2-4762-a196-c2f9b4266ba9\" (UID: \"5021efe4-f1d2-4762-a196-c2f9b4266ba9\") " Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.270484 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5021efe4-f1d2-4762-a196-c2f9b4266ba9-config-data\") pod \"5021efe4-f1d2-4762-a196-c2f9b4266ba9\" (UID: \"5021efe4-f1d2-4762-a196-c2f9b4266ba9\") " Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.271456 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvm8r\" (UniqueName: \"kubernetes.io/projected/5021efe4-f1d2-4762-a196-c2f9b4266ba9-kube-api-access-gvm8r\") pod \"5021efe4-f1d2-4762-a196-c2f9b4266ba9\" (UID: \"5021efe4-f1d2-4762-a196-c2f9b4266ba9\") " Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.276239 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/5021efe4-f1d2-4762-a196-c2f9b4266ba9-kube-api-access-gvm8r" (OuterVolumeSpecName: "kube-api-access-gvm8r") pod "5021efe4-f1d2-4762-a196-c2f9b4266ba9" (UID: "5021efe4-f1d2-4762-a196-c2f9b4266ba9"). InnerVolumeSpecName "kube-api-access-gvm8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.300345 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5021efe4-f1d2-4762-a196-c2f9b4266ba9-config-data" (OuterVolumeSpecName: "config-data") pod "5021efe4-f1d2-4762-a196-c2f9b4266ba9" (UID: "5021efe4-f1d2-4762-a196-c2f9b4266ba9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.301897 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5021efe4-f1d2-4762-a196-c2f9b4266ba9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5021efe4-f1d2-4762-a196-c2f9b4266ba9" (UID: "5021efe4-f1d2-4762-a196-c2f9b4266ba9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.375007 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvm8r\" (UniqueName: \"kubernetes.io/projected/5021efe4-f1d2-4762-a196-c2f9b4266ba9-kube-api-access-gvm8r\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.375042 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5021efe4-f1d2-4762-a196-c2f9b4266ba9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.375055 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5021efe4-f1d2-4762-a196-c2f9b4266ba9-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.533254 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.643298 4563 generic.go:334] "Generic (PLEG): container finished" podID="5021efe4-f1d2-4762-a196-c2f9b4266ba9" containerID="43fe1e4be27621f18430e2fd2123c205a977749861ba4d745645d70c938bb9c7" exitCode=0 Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.643410 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.643408 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5021efe4-f1d2-4762-a196-c2f9b4266ba9","Type":"ContainerDied","Data":"43fe1e4be27621f18430e2fd2123c205a977749861ba4d745645d70c938bb9c7"} Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.643514 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5021efe4-f1d2-4762-a196-c2f9b4266ba9","Type":"ContainerDied","Data":"7bc4a36f059098286445a01e2c4698c1567df2c7826f3c204cb759a90fe7a609"} Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.643570 4563 scope.go:117] "RemoveContainer" containerID="43fe1e4be27621f18430e2fd2123c205a977749861ba4d745645d70c938bb9c7" Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.645565 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c17fb818-0e53-4655-89ac-a1bb9022b5f8","Type":"ContainerStarted","Data":"119c6069cc041606b2ac56e817f20fd41fd05f4e8041a1564f276a4f92ea93c3"} Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.672145 4563 scope.go:117] "RemoveContainer" containerID="43fe1e4be27621f18430e2fd2123c205a977749861ba4d745645d70c938bb9c7" Nov 24 09:21:17 crc kubenswrapper[4563]: E1124 09:21:17.674328 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43fe1e4be27621f18430e2fd2123c205a977749861ba4d745645d70c938bb9c7\": container with ID starting with 43fe1e4be27621f18430e2fd2123c205a977749861ba4d745645d70c938bb9c7 not found: ID does not exist" containerID="43fe1e4be27621f18430e2fd2123c205a977749861ba4d745645d70c938bb9c7" Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.674366 4563 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"43fe1e4be27621f18430e2fd2123c205a977749861ba4d745645d70c938bb9c7"} err="failed to get container status \"43fe1e4be27621f18430e2fd2123c205a977749861ba4d745645d70c938bb9c7\": rpc error: code = NotFound desc = could not find container \"43fe1e4be27621f18430e2fd2123c205a977749861ba4d745645d70c938bb9c7\": container with ID starting with 43fe1e4be27621f18430e2fd2123c205a977749861ba4d745645d70c938bb9c7 not found: ID does not exist" Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.678374 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.685936 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.694217 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 09:21:17 crc kubenswrapper[4563]: E1124 09:21:17.694677 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5021efe4-f1d2-4762-a196-c2f9b4266ba9" containerName="nova-scheduler-scheduler" Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.694696 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="5021efe4-f1d2-4762-a196-c2f9b4266ba9" containerName="nova-scheduler-scheduler" Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.703128 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="5021efe4-f1d2-4762-a196-c2f9b4266ba9" containerName="nova-scheduler-scheduler" Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.703843 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.706236 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.706245 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.786761 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38360dce-8f0e-42b1-ba4c-d13036b2794a-config-data\") pod \"nova-scheduler-0\" (UID: \"38360dce-8f0e-42b1-ba4c-d13036b2794a\") " pod="openstack/nova-scheduler-0" Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.786848 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tz48\" (UniqueName: \"kubernetes.io/projected/38360dce-8f0e-42b1-ba4c-d13036b2794a-kube-api-access-5tz48\") pod \"nova-scheduler-0\" (UID: \"38360dce-8f0e-42b1-ba4c-d13036b2794a\") " pod="openstack/nova-scheduler-0" Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.787039 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38360dce-8f0e-42b1-ba4c-d13036b2794a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"38360dce-8f0e-42b1-ba4c-d13036b2794a\") " pod="openstack/nova-scheduler-0" Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.889084 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38360dce-8f0e-42b1-ba4c-d13036b2794a-config-data\") pod \"nova-scheduler-0\" (UID: \"38360dce-8f0e-42b1-ba4c-d13036b2794a\") " pod="openstack/nova-scheduler-0" Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.889406 4563 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5tz48\" (UniqueName: \"kubernetes.io/projected/38360dce-8f0e-42b1-ba4c-d13036b2794a-kube-api-access-5tz48\") pod \"nova-scheduler-0\" (UID: \"38360dce-8f0e-42b1-ba4c-d13036b2794a\") " pod="openstack/nova-scheduler-0" Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.889479 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38360dce-8f0e-42b1-ba4c-d13036b2794a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"38360dce-8f0e-42b1-ba4c-d13036b2794a\") " pod="openstack/nova-scheduler-0" Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.893948 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38360dce-8f0e-42b1-ba4c-d13036b2794a-config-data\") pod \"nova-scheduler-0\" (UID: \"38360dce-8f0e-42b1-ba4c-d13036b2794a\") " pod="openstack/nova-scheduler-0" Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.894362 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38360dce-8f0e-42b1-ba4c-d13036b2794a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"38360dce-8f0e-42b1-ba4c-d13036b2794a\") " pod="openstack/nova-scheduler-0" Nov 24 09:21:17 crc kubenswrapper[4563]: I1124 09:21:17.907128 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tz48\" (UniqueName: \"kubernetes.io/projected/38360dce-8f0e-42b1-ba4c-d13036b2794a-kube-api-access-5tz48\") pod \"nova-scheduler-0\" (UID: \"38360dce-8f0e-42b1-ba4c-d13036b2794a\") " pod="openstack/nova-scheduler-0" Nov 24 09:21:18 crc kubenswrapper[4563]: I1124 09:21:18.026383 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 24 09:21:18 crc kubenswrapper[4563]: I1124 09:21:18.419461 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 24 09:21:18 crc kubenswrapper[4563]: W1124 09:21:18.423550 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38360dce_8f0e_42b1_ba4c_d13036b2794a.slice/crio-bfd0cad2940afa18fcb31d8b3b1fbeaa88e29b32f4986205601321fccab3cdbd WatchSource:0}: Error finding container bfd0cad2940afa18fcb31d8b3b1fbeaa88e29b32f4986205601321fccab3cdbd: Status 404 returned error can't find the container with id bfd0cad2940afa18fcb31d8b3b1fbeaa88e29b32f4986205601321fccab3cdbd Nov 24 09:21:18 crc kubenswrapper[4563]: I1124 09:21:18.658193 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"38360dce-8f0e-42b1-ba4c-d13036b2794a","Type":"ContainerStarted","Data":"39aa728d4b04bb70539862668516d277dedd4fcd97d49e7694df9d9fbdcabb60"} Nov 24 09:21:18 crc kubenswrapper[4563]: I1124 09:21:18.659563 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"38360dce-8f0e-42b1-ba4c-d13036b2794a","Type":"ContainerStarted","Data":"bfd0cad2940afa18fcb31d8b3b1fbeaa88e29b32f4986205601321fccab3cdbd"} Nov 24 09:21:18 crc kubenswrapper[4563]: I1124 09:21:18.661510 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c17fb818-0e53-4655-89ac-a1bb9022b5f8","Type":"ContainerStarted","Data":"c593ed13301feda50bb77b907898006d19b5b7c2fc29ffce2b8580c77d1df599"} Nov 24 09:21:18 crc kubenswrapper[4563]: I1124 09:21:18.661562 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c17fb818-0e53-4655-89ac-a1bb9022b5f8","Type":"ContainerStarted","Data":"0b4bb512ad36f9151ee248dc142887411a1458c2e8c56b58f0e02930a6bd0bb5"} Nov 24 09:21:18 crc 
kubenswrapper[4563]: I1124 09:21:18.682466 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.682447421 podStartE2EDuration="1.682447421s" podCreationTimestamp="2025-11-24 09:21:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:21:18.676119728 +0000 UTC m=+1055.935097175" watchObservedRunningTime="2025-11-24 09:21:18.682447421 +0000 UTC m=+1055.941424868" Nov 24 09:21:18 crc kubenswrapper[4563]: I1124 09:21:18.691529 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.69151478 podStartE2EDuration="2.69151478s" podCreationTimestamp="2025-11-24 09:21:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:21:18.688866317 +0000 UTC m=+1055.947843763" watchObservedRunningTime="2025-11-24 09:21:18.69151478 +0000 UTC m=+1055.950492228" Nov 24 09:21:19 crc kubenswrapper[4563]: I1124 09:21:19.063170 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5021efe4-f1d2-4762-a196-c2f9b4266ba9" path="/var/lib/kubelet/pods/5021efe4-f1d2-4762-a196-c2f9b4266ba9/volumes" Nov 24 09:21:22 crc kubenswrapper[4563]: I1124 09:21:22.100965 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 24 09:21:22 crc kubenswrapper[4563]: I1124 09:21:22.101287 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 24 09:21:23 crc kubenswrapper[4563]: I1124 09:21:23.027295 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 24 09:21:24 crc kubenswrapper[4563]: I1124 09:21:24.545420 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-api-0" Nov 24 09:21:24 crc kubenswrapper[4563]: I1124 09:21:24.545492 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 24 09:21:25 crc kubenswrapper[4563]: I1124 09:21:25.562767 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 24 09:21:25 crc kubenswrapper[4563]: I1124 09:21:25.562818 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 24 09:21:27 crc kubenswrapper[4563]: I1124 09:21:27.101628 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 24 09:21:27 crc kubenswrapper[4563]: I1124 09:21:27.101723 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 24 09:21:28 crc kubenswrapper[4563]: I1124 09:21:28.027219 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 24 09:21:28 crc kubenswrapper[4563]: I1124 09:21:28.049250 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 24 09:21:28 crc kubenswrapper[4563]: I1124 09:21:28.119784 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c17fb818-0e53-4655-89ac-a1bb9022b5f8" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 24 09:21:28 
crc kubenswrapper[4563]: I1124 09:21:28.119787 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c17fb818-0e53-4655-89ac-a1bb9022b5f8" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 24 09:21:28 crc kubenswrapper[4563]: I1124 09:21:28.780006 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 24 09:21:30 crc kubenswrapper[4563]: I1124 09:21:30.794605 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 24 09:21:34 crc kubenswrapper[4563]: I1124 09:21:34.552494 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 24 09:21:34 crc kubenswrapper[4563]: I1124 09:21:34.552993 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 24 09:21:34 crc kubenswrapper[4563]: I1124 09:21:34.553587 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 24 09:21:34 crc kubenswrapper[4563]: I1124 09:21:34.553609 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 24 09:21:34 crc kubenswrapper[4563]: I1124 09:21:34.557603 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 24 09:21:34 crc kubenswrapper[4563]: I1124 09:21:34.558990 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 24 09:21:37 crc kubenswrapper[4563]: I1124 09:21:37.105944 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 24 09:21:37 crc kubenswrapper[4563]: I1124 09:21:37.106602 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-metadata-0" Nov 24 09:21:37 crc kubenswrapper[4563]: I1124 09:21:37.110688 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 24 09:21:37 crc kubenswrapper[4563]: I1124 09:21:37.110875 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 24 09:21:43 crc kubenswrapper[4563]: I1124 09:21:43.253030 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 24 09:21:43 crc kubenswrapper[4563]: I1124 09:21:43.897114 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 24 09:21:46 crc kubenswrapper[4563]: I1124 09:21:46.835792 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="18ec698b-354c-4d4e-9126-16c493474617" containerName="rabbitmq" containerID="cri-o://42642e2a9a6e213f467e573d7fa90a42a4dc0cb5e3ac48057c129287a5436410" gracePeriod=604797 Nov 24 09:21:47 crc kubenswrapper[4563]: I1124 09:21:47.587840 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="e4286a17-bf24-4c91-91cb-6e3f3d731d24" containerName="rabbitmq" containerID="cri-o://38156802d1ddfdcca5cd431c7fb277955aa2c8613d910c8ae419ae1320035f56" gracePeriod=604797 Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.011234 4563 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="18ec698b-354c-4d4e-9126-16c493474617" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.234395 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.272219 4563 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="e4286a17-bf24-4c91-91cb-6e3f3d731d24" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.372117 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlmx5\" (UniqueName: \"kubernetes.io/projected/18ec698b-354c-4d4e-9126-16c493474617-kube-api-access-mlmx5\") pod \"18ec698b-354c-4d4e-9126-16c493474617\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.372444 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/18ec698b-354c-4d4e-9126-16c493474617-plugins-conf\") pod \"18ec698b-354c-4d4e-9126-16c493474617\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.372585 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/18ec698b-354c-4d4e-9126-16c493474617-rabbitmq-plugins\") pod \"18ec698b-354c-4d4e-9126-16c493474617\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.372703 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"18ec698b-354c-4d4e-9126-16c493474617\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.372787 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/18ec698b-354c-4d4e-9126-16c493474617-rabbitmq-confd\") pod \"18ec698b-354c-4d4e-9126-16c493474617\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.372854 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/18ec698b-354c-4d4e-9126-16c493474617-server-conf\") pod \"18ec698b-354c-4d4e-9126-16c493474617\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.372965 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/18ec698b-354c-4d4e-9126-16c493474617-rabbitmq-erlang-cookie\") pod \"18ec698b-354c-4d4e-9126-16c493474617\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.372993 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18ec698b-354c-4d4e-9126-16c493474617-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "18ec698b-354c-4d4e-9126-16c493474617" (UID: "18ec698b-354c-4d4e-9126-16c493474617"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.373048 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/18ec698b-354c-4d4e-9126-16c493474617-rabbitmq-tls\") pod \"18ec698b-354c-4d4e-9126-16c493474617\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.373255 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18ec698b-354c-4d4e-9126-16c493474617-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "18ec698b-354c-4d4e-9126-16c493474617" (UID: "18ec698b-354c-4d4e-9126-16c493474617"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.373344 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18ec698b-354c-4d4e-9126-16c493474617-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "18ec698b-354c-4d4e-9126-16c493474617" (UID: "18ec698b-354c-4d4e-9126-16c493474617"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.374038 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/18ec698b-354c-4d4e-9126-16c493474617-pod-info\") pod \"18ec698b-354c-4d4e-9126-16c493474617\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.374092 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18ec698b-354c-4d4e-9126-16c493474617-config-data\") pod \"18ec698b-354c-4d4e-9126-16c493474617\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.374131 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/18ec698b-354c-4d4e-9126-16c493474617-erlang-cookie-secret\") pod \"18ec698b-354c-4d4e-9126-16c493474617\" (UID: \"18ec698b-354c-4d4e-9126-16c493474617\") " Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.375093 4563 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/18ec698b-354c-4d4e-9126-16c493474617-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.375114 4563 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/18ec698b-354c-4d4e-9126-16c493474617-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.375125 4563 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/18ec698b-354c-4d4e-9126-16c493474617-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.378804 4563 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18ec698b-354c-4d4e-9126-16c493474617-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "18ec698b-354c-4d4e-9126-16c493474617" (UID: "18ec698b-354c-4d4e-9126-16c493474617"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.378865 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18ec698b-354c-4d4e-9126-16c493474617-kube-api-access-mlmx5" (OuterVolumeSpecName: "kube-api-access-mlmx5") pod "18ec698b-354c-4d4e-9126-16c493474617" (UID: "18ec698b-354c-4d4e-9126-16c493474617"). InnerVolumeSpecName "kube-api-access-mlmx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.379743 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/18ec698b-354c-4d4e-9126-16c493474617-pod-info" (OuterVolumeSpecName: "pod-info") pod "18ec698b-354c-4d4e-9126-16c493474617" (UID: "18ec698b-354c-4d4e-9126-16c493474617"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.387723 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18ec698b-354c-4d4e-9126-16c493474617-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "18ec698b-354c-4d4e-9126-16c493474617" (UID: "18ec698b-354c-4d4e-9126-16c493474617"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.387795 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "18ec698b-354c-4d4e-9126-16c493474617" (UID: "18ec698b-354c-4d4e-9126-16c493474617"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.395520 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18ec698b-354c-4d4e-9126-16c493474617-config-data" (OuterVolumeSpecName: "config-data") pod "18ec698b-354c-4d4e-9126-16c493474617" (UID: "18ec698b-354c-4d4e-9126-16c493474617"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.416501 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18ec698b-354c-4d4e-9126-16c493474617-server-conf" (OuterVolumeSpecName: "server-conf") pod "18ec698b-354c-4d4e-9126-16c493474617" (UID: "18ec698b-354c-4d4e-9126-16c493474617"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.456429 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18ec698b-354c-4d4e-9126-16c493474617-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "18ec698b-354c-4d4e-9126-16c493474617" (UID: "18ec698b-354c-4d4e-9126-16c493474617"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.477069 4563 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/18ec698b-354c-4d4e-9126-16c493474617-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.477098 4563 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/18ec698b-354c-4d4e-9126-16c493474617-server-conf\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.477107 4563 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/18ec698b-354c-4d4e-9126-16c493474617-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.477115 4563 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/18ec698b-354c-4d4e-9126-16c493474617-pod-info\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.477124 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18ec698b-354c-4d4e-9126-16c493474617-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.477133 4563 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/18ec698b-354c-4d4e-9126-16c493474617-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.477142 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlmx5\" (UniqueName: \"kubernetes.io/projected/18ec698b-354c-4d4e-9126-16c493474617-kube-api-access-mlmx5\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.477170 4563 
reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.492246 4563 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.580853 4563 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.921355 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.986599 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckl45\" (UniqueName: \"kubernetes.io/projected/e4286a17-bf24-4c91-91cb-6e3f3d731d24-kube-api-access-ckl45\") pod \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.986657 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.986688 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e4286a17-bf24-4c91-91cb-6e3f3d731d24-pod-info\") pod \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.986719 4563 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e4286a17-bf24-4c91-91cb-6e3f3d731d24-rabbitmq-erlang-cookie\") pod \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.986742 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e4286a17-bf24-4c91-91cb-6e3f3d731d24-rabbitmq-plugins\") pod \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.986780 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e4286a17-bf24-4c91-91cb-6e3f3d731d24-plugins-conf\") pod \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.986827 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e4286a17-bf24-4c91-91cb-6e3f3d731d24-server-conf\") pod \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.986857 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4286a17-bf24-4c91-91cb-6e3f3d731d24-config-data\") pod \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.986897 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e4286a17-bf24-4c91-91cb-6e3f3d731d24-erlang-cookie-secret\") pod \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\" (UID: 
\"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.986919 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e4286a17-bf24-4c91-91cb-6e3f3d731d24-rabbitmq-tls\") pod \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.986967 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e4286a17-bf24-4c91-91cb-6e3f3d731d24-rabbitmq-confd\") pod \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\" (UID: \"e4286a17-bf24-4c91-91cb-6e3f3d731d24\") " Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.988596 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4286a17-bf24-4c91-91cb-6e3f3d731d24-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e4286a17-bf24-4c91-91cb-6e3f3d731d24" (UID: "e4286a17-bf24-4c91-91cb-6e3f3d731d24"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.991774 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4286a17-bf24-4c91-91cb-6e3f3d731d24-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e4286a17-bf24-4c91-91cb-6e3f3d731d24" (UID: "e4286a17-bf24-4c91-91cb-6e3f3d731d24"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.993027 4563 generic.go:334] "Generic (PLEG): container finished" podID="e4286a17-bf24-4c91-91cb-6e3f3d731d24" containerID="38156802d1ddfdcca5cd431c7fb277955aa2c8613d910c8ae419ae1320035f56" exitCode=0 Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.994097 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4286a17-bf24-4c91-91cb-6e3f3d731d24-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e4286a17-bf24-4c91-91cb-6e3f3d731d24" (UID: "e4286a17-bf24-4c91-91cb-6e3f3d731d24"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.994843 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.995675 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e4286a17-bf24-4c91-91cb-6e3f3d731d24","Type":"ContainerDied","Data":"38156802d1ddfdcca5cd431c7fb277955aa2c8613d910c8ae419ae1320035f56"} Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.995723 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e4286a17-bf24-4c91-91cb-6e3f3d731d24","Type":"ContainerDied","Data":"f96a73c1ef89c6805e0fed827ea5f529bbe6267141e628c088464edcf9779f53"} Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.995744 4563 scope.go:117] "RemoveContainer" containerID="38156802d1ddfdcca5cd431c7fb277955aa2c8613d910c8ae419ae1320035f56" Nov 24 09:21:53 crc kubenswrapper[4563]: I1124 09:21:53.996949 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4286a17-bf24-4c91-91cb-6e3f3d731d24-kube-api-access-ckl45" 
(OuterVolumeSpecName: "kube-api-access-ckl45") pod "e4286a17-bf24-4c91-91cb-6e3f3d731d24" (UID: "e4286a17-bf24-4c91-91cb-6e3f3d731d24"). InnerVolumeSpecName "kube-api-access-ckl45". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.001233 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4286a17-bf24-4c91-91cb-6e3f3d731d24-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e4286a17-bf24-4c91-91cb-6e3f3d731d24" (UID: "e4286a17-bf24-4c91-91cb-6e3f3d731d24"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.007266 4563 generic.go:334] "Generic (PLEG): container finished" podID="18ec698b-354c-4d4e-9126-16c493474617" containerID="42642e2a9a6e213f467e573d7fa90a42a4dc0cb5e3ac48057c129287a5436410" exitCode=0 Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.007405 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.007487 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"18ec698b-354c-4d4e-9126-16c493474617","Type":"ContainerDied","Data":"42642e2a9a6e213f467e573d7fa90a42a4dc0cb5e3ac48057c129287a5436410"} Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.007525 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"18ec698b-354c-4d4e-9126-16c493474617","Type":"ContainerDied","Data":"9f9424183d254ee3d4d77aeb91526ff42f93d3ed3e47cb81dcfc8c5f41cc2afc"} Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.009853 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4286a17-bf24-4c91-91cb-6e3f3d731d24-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e4286a17-bf24-4c91-91cb-6e3f3d731d24" (UID: "e4286a17-bf24-4c91-91cb-6e3f3d731d24"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.010659 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "e4286a17-bf24-4c91-91cb-6e3f3d731d24" (UID: "e4286a17-bf24-4c91-91cb-6e3f3d731d24"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.012759 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e4286a17-bf24-4c91-91cb-6e3f3d731d24-pod-info" (OuterVolumeSpecName: "pod-info") pod "e4286a17-bf24-4c91-91cb-6e3f3d731d24" (UID: "e4286a17-bf24-4c91-91cb-6e3f3d731d24"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.028539 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4286a17-bf24-4c91-91cb-6e3f3d731d24-config-data" (OuterVolumeSpecName: "config-data") pod "e4286a17-bf24-4c91-91cb-6e3f3d731d24" (UID: "e4286a17-bf24-4c91-91cb-6e3f3d731d24"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.049482 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4286a17-bf24-4c91-91cb-6e3f3d731d24-server-conf" (OuterVolumeSpecName: "server-conf") pod "e4286a17-bf24-4c91-91cb-6e3f3d731d24" (UID: "e4286a17-bf24-4c91-91cb-6e3f3d731d24"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.089543 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckl45\" (UniqueName: \"kubernetes.io/projected/e4286a17-bf24-4c91-91cb-6e3f3d731d24-kube-api-access-ckl45\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.089588 4563 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.089600 4563 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e4286a17-bf24-4c91-91cb-6e3f3d731d24-pod-info\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.089610 4563 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e4286a17-bf24-4c91-91cb-6e3f3d731d24-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath 
\"\"" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.089618 4563 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e4286a17-bf24-4c91-91cb-6e3f3d731d24-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.089626 4563 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e4286a17-bf24-4c91-91cb-6e3f3d731d24-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.089735 4563 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e4286a17-bf24-4c91-91cb-6e3f3d731d24-server-conf\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.089746 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e4286a17-bf24-4c91-91cb-6e3f3d731d24-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.089754 4563 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e4286a17-bf24-4c91-91cb-6e3f3d731d24-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.089763 4563 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e4286a17-bf24-4c91-91cb-6e3f3d731d24-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.089784 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4286a17-bf24-4c91-91cb-6e3f3d731d24-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e4286a17-bf24-4c91-91cb-6e3f3d731d24" (UID: "e4286a17-bf24-4c91-91cb-6e3f3d731d24"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.108000 4563 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.152000 4563 scope.go:117] "RemoveContainer" containerID="73d417eaecd31e6125b6cbd867865adffe9c7cd078cefc1bbdf0d4e6e9eec4e1" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.160555 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.168299 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.175283 4563 scope.go:117] "RemoveContainer" containerID="38156802d1ddfdcca5cd431c7fb277955aa2c8613d910c8ae419ae1320035f56" Nov 24 09:21:54 crc kubenswrapper[4563]: E1124 09:21:54.175664 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38156802d1ddfdcca5cd431c7fb277955aa2c8613d910c8ae419ae1320035f56\": container with ID starting with 38156802d1ddfdcca5cd431c7fb277955aa2c8613d910c8ae419ae1320035f56 not found: ID does not exist" containerID="38156802d1ddfdcca5cd431c7fb277955aa2c8613d910c8ae419ae1320035f56" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.175693 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38156802d1ddfdcca5cd431c7fb277955aa2c8613d910c8ae419ae1320035f56"} err="failed to get container status \"38156802d1ddfdcca5cd431c7fb277955aa2c8613d910c8ae419ae1320035f56\": rpc error: code = NotFound desc = could not find container \"38156802d1ddfdcca5cd431c7fb277955aa2c8613d910c8ae419ae1320035f56\": container with ID starting with 38156802d1ddfdcca5cd431c7fb277955aa2c8613d910c8ae419ae1320035f56 not found: ID does 
not exist" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.175717 4563 scope.go:117] "RemoveContainer" containerID="73d417eaecd31e6125b6cbd867865adffe9c7cd078cefc1bbdf0d4e6e9eec4e1" Nov 24 09:21:54 crc kubenswrapper[4563]: E1124 09:21:54.177829 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73d417eaecd31e6125b6cbd867865adffe9c7cd078cefc1bbdf0d4e6e9eec4e1\": container with ID starting with 73d417eaecd31e6125b6cbd867865adffe9c7cd078cefc1bbdf0d4e6e9eec4e1 not found: ID does not exist" containerID="73d417eaecd31e6125b6cbd867865adffe9c7cd078cefc1bbdf0d4e6e9eec4e1" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.177878 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73d417eaecd31e6125b6cbd867865adffe9c7cd078cefc1bbdf0d4e6e9eec4e1"} err="failed to get container status \"73d417eaecd31e6125b6cbd867865adffe9c7cd078cefc1bbdf0d4e6e9eec4e1\": rpc error: code = NotFound desc = could not find container \"73d417eaecd31e6125b6cbd867865adffe9c7cd078cefc1bbdf0d4e6e9eec4e1\": container with ID starting with 73d417eaecd31e6125b6cbd867865adffe9c7cd078cefc1bbdf0d4e6e9eec4e1 not found: ID does not exist" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.177909 4563 scope.go:117] "RemoveContainer" containerID="42642e2a9a6e213f467e573d7fa90a42a4dc0cb5e3ac48057c129287a5436410" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.186425 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 24 09:21:54 crc kubenswrapper[4563]: E1124 09:21:54.187040 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ec698b-354c-4d4e-9126-16c493474617" containerName="setup-container" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.187060 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ec698b-354c-4d4e-9126-16c493474617" containerName="setup-container" Nov 24 09:21:54 crc 
kubenswrapper[4563]: E1124 09:21:54.187075 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ec698b-354c-4d4e-9126-16c493474617" containerName="rabbitmq" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.187081 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ec698b-354c-4d4e-9126-16c493474617" containerName="rabbitmq" Nov 24 09:21:54 crc kubenswrapper[4563]: E1124 09:21:54.187100 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4286a17-bf24-4c91-91cb-6e3f3d731d24" containerName="setup-container" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.187108 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4286a17-bf24-4c91-91cb-6e3f3d731d24" containerName="setup-container" Nov 24 09:21:54 crc kubenswrapper[4563]: E1124 09:21:54.187137 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4286a17-bf24-4c91-91cb-6e3f3d731d24" containerName="rabbitmq" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.187143 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4286a17-bf24-4c91-91cb-6e3f3d731d24" containerName="rabbitmq" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.187370 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4286a17-bf24-4c91-91cb-6e3f3d731d24" containerName="rabbitmq" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.187387 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="18ec698b-354c-4d4e-9126-16c493474617" containerName="rabbitmq" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.188619 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.191185 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.191360 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.191631 4563 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e4286a17-bf24-4c91-91cb-6e3f3d731d24-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.191678 4563 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.192206 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.192382 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.192528 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-pntpb" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.192662 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.192778 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.201277 4563 scope.go:117] "RemoveContainer" containerID="92de487af76ff9936312b95bbfc2bef78c4d67ca7362338d5d1ace860caa89c0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.213290 4563 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.221609 4563 scope.go:117] "RemoveContainer" containerID="42642e2a9a6e213f467e573d7fa90a42a4dc0cb5e3ac48057c129287a5436410" Nov 24 09:21:54 crc kubenswrapper[4563]: E1124 09:21:54.222052 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42642e2a9a6e213f467e573d7fa90a42a4dc0cb5e3ac48057c129287a5436410\": container with ID starting with 42642e2a9a6e213f467e573d7fa90a42a4dc0cb5e3ac48057c129287a5436410 not found: ID does not exist" containerID="42642e2a9a6e213f467e573d7fa90a42a4dc0cb5e3ac48057c129287a5436410" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.222080 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42642e2a9a6e213f467e573d7fa90a42a4dc0cb5e3ac48057c129287a5436410"} err="failed to get container status \"42642e2a9a6e213f467e573d7fa90a42a4dc0cb5e3ac48057c129287a5436410\": rpc error: code = NotFound desc = could not find container \"42642e2a9a6e213f467e573d7fa90a42a4dc0cb5e3ac48057c129287a5436410\": container with ID starting with 42642e2a9a6e213f467e573d7fa90a42a4dc0cb5e3ac48057c129287a5436410 not found: ID does not exist" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.222104 4563 scope.go:117] "RemoveContainer" containerID="92de487af76ff9936312b95bbfc2bef78c4d67ca7362338d5d1ace860caa89c0" Nov 24 09:21:54 crc kubenswrapper[4563]: E1124 09:21:54.222386 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92de487af76ff9936312b95bbfc2bef78c4d67ca7362338d5d1ace860caa89c0\": container with ID starting with 92de487af76ff9936312b95bbfc2bef78c4d67ca7362338d5d1ace860caa89c0 not found: ID does not exist" containerID="92de487af76ff9936312b95bbfc2bef78c4d67ca7362338d5d1ace860caa89c0" Nov 24 09:21:54 crc 
kubenswrapper[4563]: I1124 09:21:54.222404 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92de487af76ff9936312b95bbfc2bef78c4d67ca7362338d5d1ace860caa89c0"} err="failed to get container status \"92de487af76ff9936312b95bbfc2bef78c4d67ca7362338d5d1ace860caa89c0\": rpc error: code = NotFound desc = could not find container \"92de487af76ff9936312b95bbfc2bef78c4d67ca7362338d5d1ace860caa89c0\": container with ID starting with 92de487af76ff9936312b95bbfc2bef78c4d67ca7362338d5d1ace860caa89c0 not found: ID does not exist" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.288866 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-844899475f-c2kkp"] Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.290597 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-844899475f-c2kkp" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.293362 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/45003ba2-beec-43e7-9248-42c517ed3bf7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.293411 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/45003ba2-beec-43e7-9248-42c517ed3bf7-config-data\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.293443 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/45003ba2-beec-43e7-9248-42c517ed3bf7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: 
\"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.293463 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/45003ba2-beec-43e7-9248-42c517ed3bf7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.293494 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rn94\" (UniqueName: \"kubernetes.io/projected/45003ba2-beec-43e7-9248-42c517ed3bf7-kube-api-access-6rn94\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.293518 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/45003ba2-beec-43e7-9248-42c517ed3bf7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.293577 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/45003ba2-beec-43e7-9248-42c517ed3bf7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.293688 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " 
pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.293732 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/45003ba2-beec-43e7-9248-42c517ed3bf7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.293780 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/45003ba2-beec-43e7-9248-42c517ed3bf7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.294063 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/45003ba2-beec-43e7-9248-42c517ed3bf7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.300203 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.315446 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-844899475f-c2kkp"] Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.349676 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.357455 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.377457 4563 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.379659 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.381814 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.382092 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-wfdxh" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.382229 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.383668 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.383817 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.384314 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.384837 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.396127 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/45003ba2-beec-43e7-9248-42c517ed3bf7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.396193 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6wxtl\" (UniqueName: \"kubernetes.io/projected/4e9c5eaf-257a-4346-8516-a48a9a2398aa-kube-api-access-6wxtl\") pod \"dnsmasq-dns-844899475f-c2kkp\" (UID: \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\") " pod="openstack/dnsmasq-dns-844899475f-c2kkp" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.396243 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.396275 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-config\") pod \"dnsmasq-dns-844899475f-c2kkp\" (UID: \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\") " pod="openstack/dnsmasq-dns-844899475f-c2kkp" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.396297 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/45003ba2-beec-43e7-9248-42c517ed3bf7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.396365 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-ovsdbserver-sb\") pod \"dnsmasq-dns-844899475f-c2kkp\" (UID: \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\") " pod="openstack/dnsmasq-dns-844899475f-c2kkp" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.396405 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/45003ba2-beec-43e7-9248-42c517ed3bf7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.396559 4563 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.396662 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/45003ba2-beec-43e7-9248-42c517ed3bf7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.396691 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/45003ba2-beec-43e7-9248-42c517ed3bf7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.396764 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-ovsdbserver-nb\") pod \"dnsmasq-dns-844899475f-c2kkp\" (UID: \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\") " pod="openstack/dnsmasq-dns-844899475f-c2kkp" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.396842 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-openstack-edpm-ipam\") pod \"dnsmasq-dns-844899475f-c2kkp\" (UID: \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\") " pod="openstack/dnsmasq-dns-844899475f-c2kkp" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.396883 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-dns-swift-storage-0\") pod \"dnsmasq-dns-844899475f-c2kkp\" (UID: \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\") " pod="openstack/dnsmasq-dns-844899475f-c2kkp" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.396935 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/45003ba2-beec-43e7-9248-42c517ed3bf7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.396977 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/45003ba2-beec-43e7-9248-42c517ed3bf7-config-data\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.397010 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/45003ba2-beec-43e7-9248-42c517ed3bf7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.397033 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/45003ba2-beec-43e7-9248-42c517ed3bf7-rabbitmq-plugins\") pod 
\"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.397076 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-dns-svc\") pod \"dnsmasq-dns-844899475f-c2kkp\" (UID: \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\") " pod="openstack/dnsmasq-dns-844899475f-c2kkp" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.397104 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rn94\" (UniqueName: \"kubernetes.io/projected/45003ba2-beec-43e7-9248-42c517ed3bf7-kube-api-access-6rn94\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.397130 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/45003ba2-beec-43e7-9248-42c517ed3bf7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.397671 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/45003ba2-beec-43e7-9248-42c517ed3bf7-config-data\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.397957 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/45003ba2-beec-43e7-9248-42c517ed3bf7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc 
kubenswrapper[4563]: I1124 09:21:54.397962 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/45003ba2-beec-43e7-9248-42c517ed3bf7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.398006 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/45003ba2-beec-43e7-9248-42c517ed3bf7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.402605 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/45003ba2-beec-43e7-9248-42c517ed3bf7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.406188 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/45003ba2-beec-43e7-9248-42c517ed3bf7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.406693 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/45003ba2-beec-43e7-9248-42c517ed3bf7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.409263 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/45003ba2-beec-43e7-9248-42c517ed3bf7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.410472 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.412180 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rn94\" (UniqueName: \"kubernetes.io/projected/45003ba2-beec-43e7-9248-42c517ed3bf7-kube-api-access-6rn94\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.426100 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"45003ba2-beec-43e7-9248-42c517ed3bf7\") " pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.498410 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-ovsdbserver-nb\") pod \"dnsmasq-dns-844899475f-c2kkp\" (UID: \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\") " pod="openstack/dnsmasq-dns-844899475f-c2kkp" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.498470 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-openstack-edpm-ipam\") pod \"dnsmasq-dns-844899475f-c2kkp\" (UID: \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\") " pod="openstack/dnsmasq-dns-844899475f-c2kkp" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.498492 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-dns-swift-storage-0\") pod \"dnsmasq-dns-844899475f-c2kkp\" (UID: \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\") " pod="openstack/dnsmasq-dns-844899475f-c2kkp" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.498519 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/62faa658-2c71-4afe-9fc2-4d9fd0079928-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.498560 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.498581 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/62faa658-2c71-4afe-9fc2-4d9fd0079928-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.498606 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-dns-svc\") pod \"dnsmasq-dns-844899475f-c2kkp\" (UID: \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\") " pod="openstack/dnsmasq-dns-844899475f-c2kkp" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.498624 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/62faa658-2c71-4afe-9fc2-4d9fd0079928-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.498665 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/62faa658-2c71-4afe-9fc2-4d9fd0079928-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.498724 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/62faa658-2c71-4afe-9fc2-4d9fd0079928-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.498743 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c87h\" (UniqueName: \"kubernetes.io/projected/62faa658-2c71-4afe-9fc2-4d9fd0079928-kube-api-access-5c87h\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.498862 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/62faa658-2c71-4afe-9fc2-4d9fd0079928-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.498892 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/62faa658-2c71-4afe-9fc2-4d9fd0079928-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.498922 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/62faa658-2c71-4afe-9fc2-4d9fd0079928-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.498947 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wxtl\" (UniqueName: \"kubernetes.io/projected/4e9c5eaf-257a-4346-8516-a48a9a2398aa-kube-api-access-6wxtl\") pod \"dnsmasq-dns-844899475f-c2kkp\" (UID: \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\") " pod="openstack/dnsmasq-dns-844899475f-c2kkp" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.499217 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-config\") pod \"dnsmasq-dns-844899475f-c2kkp\" (UID: \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\") " pod="openstack/dnsmasq-dns-844899475f-c2kkp" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.499258 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-ovsdbserver-sb\") pod \"dnsmasq-dns-844899475f-c2kkp\" (UID: \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\") " pod="openstack/dnsmasq-dns-844899475f-c2kkp" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.499296 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/62faa658-2c71-4afe-9fc2-4d9fd0079928-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.499819 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-openstack-edpm-ipam\") pod \"dnsmasq-dns-844899475f-c2kkp\" (UID: \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\") " pod="openstack/dnsmasq-dns-844899475f-c2kkp" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.499913 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-dns-svc\") pod \"dnsmasq-dns-844899475f-c2kkp\" (UID: \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\") " pod="openstack/dnsmasq-dns-844899475f-c2kkp" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.500074 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-ovsdbserver-sb\") pod \"dnsmasq-dns-844899475f-c2kkp\" (UID: \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\") " pod="openstack/dnsmasq-dns-844899475f-c2kkp" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.500362 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-config\") pod \"dnsmasq-dns-844899475f-c2kkp\" (UID: \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\") " pod="openstack/dnsmasq-dns-844899475f-c2kkp" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.500429 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-dns-swift-storage-0\") pod 
\"dnsmasq-dns-844899475f-c2kkp\" (UID: \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\") " pod="openstack/dnsmasq-dns-844899475f-c2kkp" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.500620 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-ovsdbserver-nb\") pod \"dnsmasq-dns-844899475f-c2kkp\" (UID: \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\") " pod="openstack/dnsmasq-dns-844899475f-c2kkp" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.506283 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.515492 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wxtl\" (UniqueName: \"kubernetes.io/projected/4e9c5eaf-257a-4346-8516-a48a9a2398aa-kube-api-access-6wxtl\") pod \"dnsmasq-dns-844899475f-c2kkp\" (UID: \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\") " pod="openstack/dnsmasq-dns-844899475f-c2kkp" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.601426 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/62faa658-2c71-4afe-9fc2-4d9fd0079928-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.601487 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.601516 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/62faa658-2c71-4afe-9fc2-4d9fd0079928-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.601561 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/62faa658-2c71-4afe-9fc2-4d9fd0079928-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.601591 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/62faa658-2c71-4afe-9fc2-4d9fd0079928-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.601746 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/62faa658-2c71-4afe-9fc2-4d9fd0079928-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.601775 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c87h\" (UniqueName: \"kubernetes.io/projected/62faa658-2c71-4afe-9fc2-4d9fd0079928-kube-api-access-5c87h\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.601802 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/62faa658-2c71-4afe-9fc2-4d9fd0079928-plugins-conf\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.601834 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/62faa658-2c71-4afe-9fc2-4d9fd0079928-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.601890 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/62faa658-2c71-4afe-9fc2-4d9fd0079928-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.602074 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/62faa658-2c71-4afe-9fc2-4d9fd0079928-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.603295 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/62faa658-2c71-4afe-9fc2-4d9fd0079928-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.603783 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/62faa658-2c71-4afe-9fc2-4d9fd0079928-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 
09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.604060 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/62faa658-2c71-4afe-9fc2-4d9fd0079928-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.604253 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/62faa658-2c71-4afe-9fc2-4d9fd0079928-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.604271 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/62faa658-2c71-4afe-9fc2-4d9fd0079928-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.604391 4563 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.605277 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-844899475f-c2kkp" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.606277 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/62faa658-2c71-4afe-9fc2-4d9fd0079928-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.606769 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/62faa658-2c71-4afe-9fc2-4d9fd0079928-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.611966 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/62faa658-2c71-4afe-9fc2-4d9fd0079928-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.619908 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c87h\" (UniqueName: \"kubernetes.io/projected/62faa658-2c71-4afe-9fc2-4d9fd0079928-kube-api-access-5c87h\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.620260 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/62faa658-2c71-4afe-9fc2-4d9fd0079928-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.640360 
4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"62faa658-2c71-4afe-9fc2-4d9fd0079928\") " pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.695467 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:21:54 crc kubenswrapper[4563]: I1124 09:21:54.932498 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 24 09:21:55 crc kubenswrapper[4563]: I1124 09:21:55.016466 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"45003ba2-beec-43e7-9248-42c517ed3bf7","Type":"ContainerStarted","Data":"22669884afb62d6df352cabaa98b43fb87043fc96ff8e61fedce047046590fe9"} Nov 24 09:21:55 crc kubenswrapper[4563]: W1124 09:21:55.045816 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e9c5eaf_257a_4346_8516_a48a9a2398aa.slice/crio-779781dfe7a4c7f683fd1d4f7e56aa958e464fbe97d775e5c8ad5a5025d5ab1f WatchSource:0}: Error finding container 779781dfe7a4c7f683fd1d4f7e56aa958e464fbe97d775e5c8ad5a5025d5ab1f: Status 404 returned error can't find the container with id 779781dfe7a4c7f683fd1d4f7e56aa958e464fbe97d775e5c8ad5a5025d5ab1f Nov 24 09:21:55 crc kubenswrapper[4563]: I1124 09:21:55.046343 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-844899475f-c2kkp"] Nov 24 09:21:55 crc kubenswrapper[4563]: I1124 09:21:55.063831 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18ec698b-354c-4d4e-9126-16c493474617" path="/var/lib/kubelet/pods/18ec698b-354c-4d4e-9126-16c493474617/volumes" Nov 24 09:21:55 crc kubenswrapper[4563]: I1124 09:21:55.064612 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="e4286a17-bf24-4c91-91cb-6e3f3d731d24" path="/var/lib/kubelet/pods/e4286a17-bf24-4c91-91cb-6e3f3d731d24/volumes" Nov 24 09:21:55 crc kubenswrapper[4563]: I1124 09:21:55.124139 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 24 09:21:55 crc kubenswrapper[4563]: W1124 09:21:55.131251 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62faa658_2c71_4afe_9fc2_4d9fd0079928.slice/crio-31206cab00e63ced25656d391a70964dbbd8c39d58e80b4d68b79b6dc12ea4c4 WatchSource:0}: Error finding container 31206cab00e63ced25656d391a70964dbbd8c39d58e80b4d68b79b6dc12ea4c4: Status 404 returned error can't find the container with id 31206cab00e63ced25656d391a70964dbbd8c39d58e80b4d68b79b6dc12ea4c4 Nov 24 09:21:56 crc kubenswrapper[4563]: I1124 09:21:56.029505 4563 generic.go:334] "Generic (PLEG): container finished" podID="4e9c5eaf-257a-4346-8516-a48a9a2398aa" containerID="92a7fc4b7c8383dd3ca3efdbc9b87d84460e339820e90cb49e3a0587ddc1dcda" exitCode=0 Nov 24 09:21:56 crc kubenswrapper[4563]: I1124 09:21:56.029721 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844899475f-c2kkp" event={"ID":"4e9c5eaf-257a-4346-8516-a48a9a2398aa","Type":"ContainerDied","Data":"92a7fc4b7c8383dd3ca3efdbc9b87d84460e339820e90cb49e3a0587ddc1dcda"} Nov 24 09:21:56 crc kubenswrapper[4563]: I1124 09:21:56.029749 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844899475f-c2kkp" event={"ID":"4e9c5eaf-257a-4346-8516-a48a9a2398aa","Type":"ContainerStarted","Data":"779781dfe7a4c7f683fd1d4f7e56aa958e464fbe97d775e5c8ad5a5025d5ab1f"} Nov 24 09:21:56 crc kubenswrapper[4563]: I1124 09:21:56.031070 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"62faa658-2c71-4afe-9fc2-4d9fd0079928","Type":"ContainerStarted","Data":"31206cab00e63ced25656d391a70964dbbd8c39d58e80b4d68b79b6dc12ea4c4"} Nov 24 09:21:57 crc kubenswrapper[4563]: I1124 09:21:57.040798 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"62faa658-2c71-4afe-9fc2-4d9fd0079928","Type":"ContainerStarted","Data":"e80fea6f44ae6ec56849d5018c3290062e4e6820a12a0cf9befe34e356beb705"} Nov 24 09:21:57 crc kubenswrapper[4563]: I1124 09:21:57.042627 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"45003ba2-beec-43e7-9248-42c517ed3bf7","Type":"ContainerStarted","Data":"d726d1d5e8e3d02773cdd334ebf60ba56355d7cad4146ce267c79e3ff52a1c88"} Nov 24 09:21:57 crc kubenswrapper[4563]: I1124 09:21:57.044784 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844899475f-c2kkp" event={"ID":"4e9c5eaf-257a-4346-8516-a48a9a2398aa","Type":"ContainerStarted","Data":"dc04215b7f81064d6c2632d23ac5c5ac781a8cc696779248859c15d951c48cc0"} Nov 24 09:21:57 crc kubenswrapper[4563]: I1124 09:21:57.045270 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-844899475f-c2kkp" Nov 24 09:21:57 crc kubenswrapper[4563]: I1124 09:21:57.078708 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-844899475f-c2kkp" podStartSLOduration=3.078691104 podStartE2EDuration="3.078691104s" podCreationTimestamp="2025-11-24 09:21:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:21:57.07299439 +0000 UTC m=+1094.331971838" watchObservedRunningTime="2025-11-24 09:21:57.078691104 +0000 UTC m=+1094.337668551" Nov 24 09:22:04 crc kubenswrapper[4563]: I1124 09:22:04.607761 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-844899475f-c2kkp" Nov 24 09:22:04 crc kubenswrapper[4563]: I1124 09:22:04.650336 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7f54fb65-2xzrp"] Nov 24 09:22:04 crc kubenswrapper[4563]: I1124 09:22:04.650559 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" podUID="30ae2d40-0434-41f0-8dcc-f5e67063a428" containerName="dnsmasq-dns" containerID="cri-o://7c3e6205926990d76ca04675ed9f5ffc5ff948f981d63a7df9649ea1af2208b6" gracePeriod=10 Nov 24 09:22:04 crc kubenswrapper[4563]: I1124 09:22:04.748531 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64858ddbd7-mtmng"] Nov 24 09:22:04 crc kubenswrapper[4563]: I1124 09:22:04.752050 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64858ddbd7-mtmng" Nov 24 09:22:04 crc kubenswrapper[4563]: I1124 09:22:04.759886 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64858ddbd7-mtmng"] Nov 24 09:22:04 crc kubenswrapper[4563]: I1124 09:22:04.887073 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/34b79993-dd96-4594-a00f-3ca0dd207e62-openstack-edpm-ipam\") pod \"dnsmasq-dns-64858ddbd7-mtmng\" (UID: \"34b79993-dd96-4594-a00f-3ca0dd207e62\") " pod="openstack/dnsmasq-dns-64858ddbd7-mtmng" Nov 24 09:22:04 crc kubenswrapper[4563]: I1124 09:22:04.887299 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x54vn\" (UniqueName: \"kubernetes.io/projected/34b79993-dd96-4594-a00f-3ca0dd207e62-kube-api-access-x54vn\") pod \"dnsmasq-dns-64858ddbd7-mtmng\" (UID: \"34b79993-dd96-4594-a00f-3ca0dd207e62\") " pod="openstack/dnsmasq-dns-64858ddbd7-mtmng" Nov 24 09:22:04 crc kubenswrapper[4563]: I1124 09:22:04.887329 4563 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34b79993-dd96-4594-a00f-3ca0dd207e62-dns-svc\") pod \"dnsmasq-dns-64858ddbd7-mtmng\" (UID: \"34b79993-dd96-4594-a00f-3ca0dd207e62\") " pod="openstack/dnsmasq-dns-64858ddbd7-mtmng" Nov 24 09:22:04 crc kubenswrapper[4563]: I1124 09:22:04.887462 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34b79993-dd96-4594-a00f-3ca0dd207e62-ovsdbserver-nb\") pod \"dnsmasq-dns-64858ddbd7-mtmng\" (UID: \"34b79993-dd96-4594-a00f-3ca0dd207e62\") " pod="openstack/dnsmasq-dns-64858ddbd7-mtmng" Nov 24 09:22:04 crc kubenswrapper[4563]: I1124 09:22:04.887525 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34b79993-dd96-4594-a00f-3ca0dd207e62-config\") pod \"dnsmasq-dns-64858ddbd7-mtmng\" (UID: \"34b79993-dd96-4594-a00f-3ca0dd207e62\") " pod="openstack/dnsmasq-dns-64858ddbd7-mtmng" Nov 24 09:22:04 crc kubenswrapper[4563]: I1124 09:22:04.887556 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34b79993-dd96-4594-a00f-3ca0dd207e62-dns-swift-storage-0\") pod \"dnsmasq-dns-64858ddbd7-mtmng\" (UID: \"34b79993-dd96-4594-a00f-3ca0dd207e62\") " pod="openstack/dnsmasq-dns-64858ddbd7-mtmng" Nov 24 09:22:04 crc kubenswrapper[4563]: I1124 09:22:04.887693 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34b79993-dd96-4594-a00f-3ca0dd207e62-ovsdbserver-sb\") pod \"dnsmasq-dns-64858ddbd7-mtmng\" (UID: \"34b79993-dd96-4594-a00f-3ca0dd207e62\") " pod="openstack/dnsmasq-dns-64858ddbd7-mtmng" Nov 24 09:22:04 crc kubenswrapper[4563]: I1124 
09:22:04.989717 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34b79993-dd96-4594-a00f-3ca0dd207e62-ovsdbserver-nb\") pod \"dnsmasq-dns-64858ddbd7-mtmng\" (UID: \"34b79993-dd96-4594-a00f-3ca0dd207e62\") " pod="openstack/dnsmasq-dns-64858ddbd7-mtmng" Nov 24 09:22:04 crc kubenswrapper[4563]: I1124 09:22:04.989791 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34b79993-dd96-4594-a00f-3ca0dd207e62-config\") pod \"dnsmasq-dns-64858ddbd7-mtmng\" (UID: \"34b79993-dd96-4594-a00f-3ca0dd207e62\") " pod="openstack/dnsmasq-dns-64858ddbd7-mtmng" Nov 24 09:22:04 crc kubenswrapper[4563]: I1124 09:22:04.989822 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34b79993-dd96-4594-a00f-3ca0dd207e62-dns-swift-storage-0\") pod \"dnsmasq-dns-64858ddbd7-mtmng\" (UID: \"34b79993-dd96-4594-a00f-3ca0dd207e62\") " pod="openstack/dnsmasq-dns-64858ddbd7-mtmng" Nov 24 09:22:04 crc kubenswrapper[4563]: I1124 09:22:04.989929 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34b79993-dd96-4594-a00f-3ca0dd207e62-ovsdbserver-sb\") pod \"dnsmasq-dns-64858ddbd7-mtmng\" (UID: \"34b79993-dd96-4594-a00f-3ca0dd207e62\") " pod="openstack/dnsmasq-dns-64858ddbd7-mtmng" Nov 24 09:22:04 crc kubenswrapper[4563]: I1124 09:22:04.990631 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34b79993-dd96-4594-a00f-3ca0dd207e62-config\") pod \"dnsmasq-dns-64858ddbd7-mtmng\" (UID: \"34b79993-dd96-4594-a00f-3ca0dd207e62\") " pod="openstack/dnsmasq-dns-64858ddbd7-mtmng" Nov 24 09:22:04 crc kubenswrapper[4563]: I1124 09:22:04.990664 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34b79993-dd96-4594-a00f-3ca0dd207e62-ovsdbserver-sb\") pod \"dnsmasq-dns-64858ddbd7-mtmng\" (UID: \"34b79993-dd96-4594-a00f-3ca0dd207e62\") " pod="openstack/dnsmasq-dns-64858ddbd7-mtmng" Nov 24 09:22:04 crc kubenswrapper[4563]: I1124 09:22:04.990632 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34b79993-dd96-4594-a00f-3ca0dd207e62-ovsdbserver-nb\") pod \"dnsmasq-dns-64858ddbd7-mtmng\" (UID: \"34b79993-dd96-4594-a00f-3ca0dd207e62\") " pod="openstack/dnsmasq-dns-64858ddbd7-mtmng" Nov 24 09:22:04 crc kubenswrapper[4563]: I1124 09:22:04.990781 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/34b79993-dd96-4594-a00f-3ca0dd207e62-openstack-edpm-ipam\") pod \"dnsmasq-dns-64858ddbd7-mtmng\" (UID: \"34b79993-dd96-4594-a00f-3ca0dd207e62\") " pod="openstack/dnsmasq-dns-64858ddbd7-mtmng" Nov 24 09:22:04 crc kubenswrapper[4563]: I1124 09:22:04.990791 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34b79993-dd96-4594-a00f-3ca0dd207e62-dns-swift-storage-0\") pod \"dnsmasq-dns-64858ddbd7-mtmng\" (UID: \"34b79993-dd96-4594-a00f-3ca0dd207e62\") " pod="openstack/dnsmasq-dns-64858ddbd7-mtmng" Nov 24 09:22:04 crc kubenswrapper[4563]: I1124 09:22:04.990802 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x54vn\" (UniqueName: \"kubernetes.io/projected/34b79993-dd96-4594-a00f-3ca0dd207e62-kube-api-access-x54vn\") pod \"dnsmasq-dns-64858ddbd7-mtmng\" (UID: \"34b79993-dd96-4594-a00f-3ca0dd207e62\") " pod="openstack/dnsmasq-dns-64858ddbd7-mtmng" Nov 24 09:22:04 crc kubenswrapper[4563]: I1124 09:22:04.990837 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/34b79993-dd96-4594-a00f-3ca0dd207e62-dns-svc\") pod \"dnsmasq-dns-64858ddbd7-mtmng\" (UID: \"34b79993-dd96-4594-a00f-3ca0dd207e62\") " pod="openstack/dnsmasq-dns-64858ddbd7-mtmng" Nov 24 09:22:04 crc kubenswrapper[4563]: I1124 09:22:04.991339 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/34b79993-dd96-4594-a00f-3ca0dd207e62-openstack-edpm-ipam\") pod \"dnsmasq-dns-64858ddbd7-mtmng\" (UID: \"34b79993-dd96-4594-a00f-3ca0dd207e62\") " pod="openstack/dnsmasq-dns-64858ddbd7-mtmng" Nov 24 09:22:04 crc kubenswrapper[4563]: I1124 09:22:04.991408 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34b79993-dd96-4594-a00f-3ca0dd207e62-dns-svc\") pod \"dnsmasq-dns-64858ddbd7-mtmng\" (UID: \"34b79993-dd96-4594-a00f-3ca0dd207e62\") " pod="openstack/dnsmasq-dns-64858ddbd7-mtmng" Nov 24 09:22:05 crc kubenswrapper[4563]: I1124 09:22:05.009557 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x54vn\" (UniqueName: \"kubernetes.io/projected/34b79993-dd96-4594-a00f-3ca0dd207e62-kube-api-access-x54vn\") pod \"dnsmasq-dns-64858ddbd7-mtmng\" (UID: \"34b79993-dd96-4594-a00f-3ca0dd207e62\") " pod="openstack/dnsmasq-dns-64858ddbd7-mtmng" Nov 24 09:22:05 crc kubenswrapper[4563]: I1124 09:22:05.106078 4563 generic.go:334] "Generic (PLEG): container finished" podID="30ae2d40-0434-41f0-8dcc-f5e67063a428" containerID="7c3e6205926990d76ca04675ed9f5ffc5ff948f981d63a7df9649ea1af2208b6" exitCode=0 Nov 24 09:22:05 crc kubenswrapper[4563]: I1124 09:22:05.106121 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" event={"ID":"30ae2d40-0434-41f0-8dcc-f5e67063a428","Type":"ContainerDied","Data":"7c3e6205926990d76ca04675ed9f5ffc5ff948f981d63a7df9649ea1af2208b6"} Nov 24 09:22:05 crc kubenswrapper[4563]: I1124 09:22:05.106148 4563 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" event={"ID":"30ae2d40-0434-41f0-8dcc-f5e67063a428","Type":"ContainerDied","Data":"167ffbc96580c688c2fd3a06c4bcfe5296fc6da6f6a50fa8a9e124246f1a7a17"} Nov 24 09:22:05 crc kubenswrapper[4563]: I1124 09:22:05.106158 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="167ffbc96580c688c2fd3a06c4bcfe5296fc6da6f6a50fa8a9e124246f1a7a17" Nov 24 09:22:05 crc kubenswrapper[4563]: I1124 09:22:05.124610 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64858ddbd7-mtmng" Nov 24 09:22:05 crc kubenswrapper[4563]: I1124 09:22:05.133804 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" Nov 24 09:22:05 crc kubenswrapper[4563]: I1124 09:22:05.297613 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nkt7\" (UniqueName: \"kubernetes.io/projected/30ae2d40-0434-41f0-8dcc-f5e67063a428-kube-api-access-7nkt7\") pod \"30ae2d40-0434-41f0-8dcc-f5e67063a428\" (UID: \"30ae2d40-0434-41f0-8dcc-f5e67063a428\") " Nov 24 09:22:05 crc kubenswrapper[4563]: I1124 09:22:05.297825 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-dns-swift-storage-0\") pod \"30ae2d40-0434-41f0-8dcc-f5e67063a428\" (UID: \"30ae2d40-0434-41f0-8dcc-f5e67063a428\") " Nov 24 09:22:05 crc kubenswrapper[4563]: I1124 09:22:05.297889 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-config\") pod \"30ae2d40-0434-41f0-8dcc-f5e67063a428\" (UID: \"30ae2d40-0434-41f0-8dcc-f5e67063a428\") " Nov 24 09:22:05 crc kubenswrapper[4563]: I1124 09:22:05.297914 4563 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-dns-svc\") pod \"30ae2d40-0434-41f0-8dcc-f5e67063a428\" (UID: \"30ae2d40-0434-41f0-8dcc-f5e67063a428\") " Nov 24 09:22:05 crc kubenswrapper[4563]: I1124 09:22:05.297938 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-ovsdbserver-nb\") pod \"30ae2d40-0434-41f0-8dcc-f5e67063a428\" (UID: \"30ae2d40-0434-41f0-8dcc-f5e67063a428\") " Nov 24 09:22:05 crc kubenswrapper[4563]: I1124 09:22:05.298032 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-ovsdbserver-sb\") pod \"30ae2d40-0434-41f0-8dcc-f5e67063a428\" (UID: \"30ae2d40-0434-41f0-8dcc-f5e67063a428\") " Nov 24 09:22:05 crc kubenswrapper[4563]: I1124 09:22:05.301243 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30ae2d40-0434-41f0-8dcc-f5e67063a428-kube-api-access-7nkt7" (OuterVolumeSpecName: "kube-api-access-7nkt7") pod "30ae2d40-0434-41f0-8dcc-f5e67063a428" (UID: "30ae2d40-0434-41f0-8dcc-f5e67063a428"). InnerVolumeSpecName "kube-api-access-7nkt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:22:05 crc kubenswrapper[4563]: I1124 09:22:05.336976 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "30ae2d40-0434-41f0-8dcc-f5e67063a428" (UID: "30ae2d40-0434-41f0-8dcc-f5e67063a428"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:22:05 crc kubenswrapper[4563]: I1124 09:22:05.344304 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "30ae2d40-0434-41f0-8dcc-f5e67063a428" (UID: "30ae2d40-0434-41f0-8dcc-f5e67063a428"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:22:05 crc kubenswrapper[4563]: I1124 09:22:05.344887 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-config" (OuterVolumeSpecName: "config") pod "30ae2d40-0434-41f0-8dcc-f5e67063a428" (UID: "30ae2d40-0434-41f0-8dcc-f5e67063a428"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:22:05 crc kubenswrapper[4563]: I1124 09:22:05.350709 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "30ae2d40-0434-41f0-8dcc-f5e67063a428" (UID: "30ae2d40-0434-41f0-8dcc-f5e67063a428"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:22:05 crc kubenswrapper[4563]: I1124 09:22:05.354758 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "30ae2d40-0434-41f0-8dcc-f5e67063a428" (UID: "30ae2d40-0434-41f0-8dcc-f5e67063a428"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:22:05 crc kubenswrapper[4563]: I1124 09:22:05.400860 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nkt7\" (UniqueName: \"kubernetes.io/projected/30ae2d40-0434-41f0-8dcc-f5e67063a428-kube-api-access-7nkt7\") on node \"crc\" DevicePath \"\"" Nov 24 09:22:05 crc kubenswrapper[4563]: I1124 09:22:05.400894 4563 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 24 09:22:05 crc kubenswrapper[4563]: I1124 09:22:05.400905 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:22:05 crc kubenswrapper[4563]: I1124 09:22:05.400914 4563 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 09:22:05 crc kubenswrapper[4563]: I1124 09:22:05.400923 4563 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 09:22:05 crc kubenswrapper[4563]: I1124 09:22:05.400931 4563 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30ae2d40-0434-41f0-8dcc-f5e67063a428-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 09:22:05 crc kubenswrapper[4563]: I1124 09:22:05.533042 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64858ddbd7-mtmng"] Nov 24 09:22:06 crc kubenswrapper[4563]: I1124 09:22:06.114082 4563 generic.go:334] "Generic (PLEG): container finished" podID="34b79993-dd96-4594-a00f-3ca0dd207e62" 
containerID="36a07b0e51b25d2710ebd52efb77a14a5e54264881ace630c15e8927655d72ee" exitCode=0 Nov 24 09:22:06 crc kubenswrapper[4563]: I1124 09:22:06.114178 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64858ddbd7-mtmng" event={"ID":"34b79993-dd96-4594-a00f-3ca0dd207e62","Type":"ContainerDied","Data":"36a07b0e51b25d2710ebd52efb77a14a5e54264881ace630c15e8927655d72ee"} Nov 24 09:22:06 crc kubenswrapper[4563]: I1124 09:22:06.114328 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64858ddbd7-mtmng" event={"ID":"34b79993-dd96-4594-a00f-3ca0dd207e62","Type":"ContainerStarted","Data":"8722b2c204af2a1fd80544edf87523ce7af1e798281ad44093d65f2dac55445d"} Nov 24 09:22:06 crc kubenswrapper[4563]: I1124 09:22:06.114359 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7f54fb65-2xzrp" Nov 24 09:22:06 crc kubenswrapper[4563]: I1124 09:22:06.250204 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7f54fb65-2xzrp"] Nov 24 09:22:06 crc kubenswrapper[4563]: I1124 09:22:06.258942 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d7f54fb65-2xzrp"] Nov 24 09:22:07 crc kubenswrapper[4563]: I1124 09:22:07.063808 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30ae2d40-0434-41f0-8dcc-f5e67063a428" path="/var/lib/kubelet/pods/30ae2d40-0434-41f0-8dcc-f5e67063a428/volumes" Nov 24 09:22:07 crc kubenswrapper[4563]: I1124 09:22:07.123963 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64858ddbd7-mtmng" event={"ID":"34b79993-dd96-4594-a00f-3ca0dd207e62","Type":"ContainerStarted","Data":"907e362b58292d987590d61db67e72682274ccaa94e76b41cd8b10e043ef4abf"} Nov 24 09:22:07 crc kubenswrapper[4563]: I1124 09:22:07.124122 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-64858ddbd7-mtmng" Nov 24 09:22:07 crc 
kubenswrapper[4563]: I1124 09:22:07.144074 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-64858ddbd7-mtmng" podStartSLOduration=3.144059053 podStartE2EDuration="3.144059053s" podCreationTimestamp="2025-11-24 09:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:22:07.137665765 +0000 UTC m=+1104.396643212" watchObservedRunningTime="2025-11-24 09:22:07.144059053 +0000 UTC m=+1104.403036500" Nov 24 09:22:15 crc kubenswrapper[4563]: I1124 09:22:15.125671 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-64858ddbd7-mtmng" Nov 24 09:22:15 crc kubenswrapper[4563]: I1124 09:22:15.180428 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-844899475f-c2kkp"] Nov 24 09:22:15 crc kubenswrapper[4563]: I1124 09:22:15.180744 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-844899475f-c2kkp" podUID="4e9c5eaf-257a-4346-8516-a48a9a2398aa" containerName="dnsmasq-dns" containerID="cri-o://dc04215b7f81064d6c2632d23ac5c5ac781a8cc696779248859c15d951c48cc0" gracePeriod=10 Nov 24 09:22:15 crc kubenswrapper[4563]: I1124 09:22:15.567909 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-844899475f-c2kkp" Nov 24 09:22:15 crc kubenswrapper[4563]: I1124 09:22:15.588341 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-dns-swift-storage-0\") pod \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\" (UID: \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\") " Nov 24 09:22:15 crc kubenswrapper[4563]: I1124 09:22:15.588397 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-config\") pod \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\" (UID: \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\") " Nov 24 09:22:15 crc kubenswrapper[4563]: I1124 09:22:15.588416 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-ovsdbserver-sb\") pod \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\" (UID: \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\") " Nov 24 09:22:15 crc kubenswrapper[4563]: I1124 09:22:15.588456 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-ovsdbserver-nb\") pod \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\" (UID: \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\") " Nov 24 09:22:15 crc kubenswrapper[4563]: I1124 09:22:15.588480 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-dns-svc\") pod \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\" (UID: \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\") " Nov 24 09:22:15 crc kubenswrapper[4563]: I1124 09:22:15.588517 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-openstack-edpm-ipam\") pod \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\" (UID: \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\") " Nov 24 09:22:15 crc kubenswrapper[4563]: I1124 09:22:15.588564 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wxtl\" (UniqueName: \"kubernetes.io/projected/4e9c5eaf-257a-4346-8516-a48a9a2398aa-kube-api-access-6wxtl\") pod \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\" (UID: \"4e9c5eaf-257a-4346-8516-a48a9a2398aa\") " Nov 24 09:22:15 crc kubenswrapper[4563]: I1124 09:22:15.593625 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e9c5eaf-257a-4346-8516-a48a9a2398aa-kube-api-access-6wxtl" (OuterVolumeSpecName: "kube-api-access-6wxtl") pod "4e9c5eaf-257a-4346-8516-a48a9a2398aa" (UID: "4e9c5eaf-257a-4346-8516-a48a9a2398aa"). InnerVolumeSpecName "kube-api-access-6wxtl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:22:15 crc kubenswrapper[4563]: I1124 09:22:15.624974 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4e9c5eaf-257a-4346-8516-a48a9a2398aa" (UID: "4e9c5eaf-257a-4346-8516-a48a9a2398aa"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:22:15 crc kubenswrapper[4563]: I1124 09:22:15.626792 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-config" (OuterVolumeSpecName: "config") pod "4e9c5eaf-257a-4346-8516-a48a9a2398aa" (UID: "4e9c5eaf-257a-4346-8516-a48a9a2398aa"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:22:15 crc kubenswrapper[4563]: I1124 09:22:15.627367 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4e9c5eaf-257a-4346-8516-a48a9a2398aa" (UID: "4e9c5eaf-257a-4346-8516-a48a9a2398aa"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:22:15 crc kubenswrapper[4563]: I1124 09:22:15.632084 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4e9c5eaf-257a-4346-8516-a48a9a2398aa" (UID: "4e9c5eaf-257a-4346-8516-a48a9a2398aa"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:22:15 crc kubenswrapper[4563]: I1124 09:22:15.640463 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4e9c5eaf-257a-4346-8516-a48a9a2398aa" (UID: "4e9c5eaf-257a-4346-8516-a48a9a2398aa"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:22:15 crc kubenswrapper[4563]: I1124 09:22:15.641128 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "4e9c5eaf-257a-4346-8516-a48a9a2398aa" (UID: "4e9c5eaf-257a-4346-8516-a48a9a2398aa"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:22:15 crc kubenswrapper[4563]: I1124 09:22:15.690943 4563 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 24 09:22:15 crc kubenswrapper[4563]: I1124 09:22:15.690977 4563 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:22:15 crc kubenswrapper[4563]: I1124 09:22:15.690988 4563 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 24 09:22:15 crc kubenswrapper[4563]: I1124 09:22:15.690997 4563 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 24 09:22:15 crc kubenswrapper[4563]: I1124 09:22:15.691007 4563 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 24 09:22:15 crc kubenswrapper[4563]: I1124 09:22:15.691016 4563 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4e9c5eaf-257a-4346-8516-a48a9a2398aa-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 24 09:22:15 crc kubenswrapper[4563]: I1124 09:22:15.691025 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wxtl\" (UniqueName: \"kubernetes.io/projected/4e9c5eaf-257a-4346-8516-a48a9a2398aa-kube-api-access-6wxtl\") on node \"crc\" DevicePath \"\"" Nov 24 09:22:16 crc kubenswrapper[4563]: I1124 09:22:16.191201 
4563 generic.go:334] "Generic (PLEG): container finished" podID="4e9c5eaf-257a-4346-8516-a48a9a2398aa" containerID="dc04215b7f81064d6c2632d23ac5c5ac781a8cc696779248859c15d951c48cc0" exitCode=0 Nov 24 09:22:16 crc kubenswrapper[4563]: I1124 09:22:16.191244 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844899475f-c2kkp" event={"ID":"4e9c5eaf-257a-4346-8516-a48a9a2398aa","Type":"ContainerDied","Data":"dc04215b7f81064d6c2632d23ac5c5ac781a8cc696779248859c15d951c48cc0"} Nov 24 09:22:16 crc kubenswrapper[4563]: I1124 09:22:16.191272 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844899475f-c2kkp" event={"ID":"4e9c5eaf-257a-4346-8516-a48a9a2398aa","Type":"ContainerDied","Data":"779781dfe7a4c7f683fd1d4f7e56aa958e464fbe97d775e5c8ad5a5025d5ab1f"} Nov 24 09:22:16 crc kubenswrapper[4563]: I1124 09:22:16.191291 4563 scope.go:117] "RemoveContainer" containerID="dc04215b7f81064d6c2632d23ac5c5ac781a8cc696779248859c15d951c48cc0" Nov 24 09:22:16 crc kubenswrapper[4563]: I1124 09:22:16.191421 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-844899475f-c2kkp" Nov 24 09:22:16 crc kubenswrapper[4563]: I1124 09:22:16.211428 4563 scope.go:117] "RemoveContainer" containerID="92a7fc4b7c8383dd3ca3efdbc9b87d84460e339820e90cb49e3a0587ddc1dcda" Nov 24 09:22:16 crc kubenswrapper[4563]: I1124 09:22:16.217613 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-844899475f-c2kkp"] Nov 24 09:22:16 crc kubenswrapper[4563]: I1124 09:22:16.224103 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-844899475f-c2kkp"] Nov 24 09:22:16 crc kubenswrapper[4563]: I1124 09:22:16.230260 4563 scope.go:117] "RemoveContainer" containerID="dc04215b7f81064d6c2632d23ac5c5ac781a8cc696779248859c15d951c48cc0" Nov 24 09:22:16 crc kubenswrapper[4563]: E1124 09:22:16.230901 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc04215b7f81064d6c2632d23ac5c5ac781a8cc696779248859c15d951c48cc0\": container with ID starting with dc04215b7f81064d6c2632d23ac5c5ac781a8cc696779248859c15d951c48cc0 not found: ID does not exist" containerID="dc04215b7f81064d6c2632d23ac5c5ac781a8cc696779248859c15d951c48cc0" Nov 24 09:22:16 crc kubenswrapper[4563]: I1124 09:22:16.230942 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc04215b7f81064d6c2632d23ac5c5ac781a8cc696779248859c15d951c48cc0"} err="failed to get container status \"dc04215b7f81064d6c2632d23ac5c5ac781a8cc696779248859c15d951c48cc0\": rpc error: code = NotFound desc = could not find container \"dc04215b7f81064d6c2632d23ac5c5ac781a8cc696779248859c15d951c48cc0\": container with ID starting with dc04215b7f81064d6c2632d23ac5c5ac781a8cc696779248859c15d951c48cc0 not found: ID does not exist" Nov 24 09:22:16 crc kubenswrapper[4563]: I1124 09:22:16.230974 4563 scope.go:117] "RemoveContainer" containerID="92a7fc4b7c8383dd3ca3efdbc9b87d84460e339820e90cb49e3a0587ddc1dcda" Nov 24 
09:22:16 crc kubenswrapper[4563]: E1124 09:22:16.231270 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92a7fc4b7c8383dd3ca3efdbc9b87d84460e339820e90cb49e3a0587ddc1dcda\": container with ID starting with 92a7fc4b7c8383dd3ca3efdbc9b87d84460e339820e90cb49e3a0587ddc1dcda not found: ID does not exist" containerID="92a7fc4b7c8383dd3ca3efdbc9b87d84460e339820e90cb49e3a0587ddc1dcda" Nov 24 09:22:16 crc kubenswrapper[4563]: I1124 09:22:16.231353 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92a7fc4b7c8383dd3ca3efdbc9b87d84460e339820e90cb49e3a0587ddc1dcda"} err="failed to get container status \"92a7fc4b7c8383dd3ca3efdbc9b87d84460e339820e90cb49e3a0587ddc1dcda\": rpc error: code = NotFound desc = could not find container \"92a7fc4b7c8383dd3ca3efdbc9b87d84460e339820e90cb49e3a0587ddc1dcda\": container with ID starting with 92a7fc4b7c8383dd3ca3efdbc9b87d84460e339820e90cb49e3a0587ddc1dcda not found: ID does not exist" Nov 24 09:22:17 crc kubenswrapper[4563]: I1124 09:22:17.063915 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e9c5eaf-257a-4346-8516-a48a9a2398aa" path="/var/lib/kubelet/pods/4e9c5eaf-257a-4346-8516-a48a9a2398aa/volumes" Nov 24 09:22:28 crc kubenswrapper[4563]: I1124 09:22:28.277538 4563 generic.go:334] "Generic (PLEG): container finished" podID="62faa658-2c71-4afe-9fc2-4d9fd0079928" containerID="e80fea6f44ae6ec56849d5018c3290062e4e6820a12a0cf9befe34e356beb705" exitCode=0 Nov 24 09:22:28 crc kubenswrapper[4563]: I1124 09:22:28.277628 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"62faa658-2c71-4afe-9fc2-4d9fd0079928","Type":"ContainerDied","Data":"e80fea6f44ae6ec56849d5018c3290062e4e6820a12a0cf9befe34e356beb705"} Nov 24 09:22:28 crc kubenswrapper[4563]: I1124 09:22:28.280322 4563 generic.go:334] "Generic (PLEG): container finished" 
podID="45003ba2-beec-43e7-9248-42c517ed3bf7" containerID="d726d1d5e8e3d02773cdd334ebf60ba56355d7cad4146ce267c79e3ff52a1c88" exitCode=0 Nov 24 09:22:28 crc kubenswrapper[4563]: I1124 09:22:28.280355 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"45003ba2-beec-43e7-9248-42c517ed3bf7","Type":"ContainerDied","Data":"d726d1d5e8e3d02773cdd334ebf60ba56355d7cad4146ce267c79e3ff52a1c88"} Nov 24 09:22:28 crc kubenswrapper[4563]: I1124 09:22:28.345511 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx"] Nov 24 09:22:28 crc kubenswrapper[4563]: E1124 09:22:28.346115 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30ae2d40-0434-41f0-8dcc-f5e67063a428" containerName="dnsmasq-dns" Nov 24 09:22:28 crc kubenswrapper[4563]: I1124 09:22:28.346134 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="30ae2d40-0434-41f0-8dcc-f5e67063a428" containerName="dnsmasq-dns" Nov 24 09:22:28 crc kubenswrapper[4563]: E1124 09:22:28.346172 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30ae2d40-0434-41f0-8dcc-f5e67063a428" containerName="init" Nov 24 09:22:28 crc kubenswrapper[4563]: I1124 09:22:28.346179 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="30ae2d40-0434-41f0-8dcc-f5e67063a428" containerName="init" Nov 24 09:22:28 crc kubenswrapper[4563]: E1124 09:22:28.346192 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e9c5eaf-257a-4346-8516-a48a9a2398aa" containerName="dnsmasq-dns" Nov 24 09:22:28 crc kubenswrapper[4563]: I1124 09:22:28.346198 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e9c5eaf-257a-4346-8516-a48a9a2398aa" containerName="dnsmasq-dns" Nov 24 09:22:28 crc kubenswrapper[4563]: E1124 09:22:28.346210 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e9c5eaf-257a-4346-8516-a48a9a2398aa" containerName="init" Nov 24 09:22:28 crc 
kubenswrapper[4563]: I1124 09:22:28.346215 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e9c5eaf-257a-4346-8516-a48a9a2398aa" containerName="init" Nov 24 09:22:28 crc kubenswrapper[4563]: I1124 09:22:28.346417 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e9c5eaf-257a-4346-8516-a48a9a2398aa" containerName="dnsmasq-dns" Nov 24 09:22:28 crc kubenswrapper[4563]: I1124 09:22:28.346434 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="30ae2d40-0434-41f0-8dcc-f5e67063a428" containerName="dnsmasq-dns" Nov 24 09:22:28 crc kubenswrapper[4563]: I1124 09:22:28.347085 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx" Nov 24 09:22:28 crc kubenswrapper[4563]: I1124 09:22:28.348623 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jd9jk" Nov 24 09:22:28 crc kubenswrapper[4563]: I1124 09:22:28.349304 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 09:22:28 crc kubenswrapper[4563]: I1124 09:22:28.349755 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 24 09:22:28 crc kubenswrapper[4563]: I1124 09:22:28.352714 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 24 09:22:28 crc kubenswrapper[4563]: I1124 09:22:28.352923 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx"] Nov 24 09:22:28 crc kubenswrapper[4563]: I1124 09:22:28.510972 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3cdb156-f67f-4dd2-b04d-9fb263802321-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx\" (UID: 
\"c3cdb156-f67f-4dd2-b04d-9fb263802321\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx" Nov 24 09:22:28 crc kubenswrapper[4563]: I1124 09:22:28.511026 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3cdb156-f67f-4dd2-b04d-9fb263802321-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx\" (UID: \"c3cdb156-f67f-4dd2-b04d-9fb263802321\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx" Nov 24 09:22:28 crc kubenswrapper[4563]: I1124 09:22:28.511065 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9jhp\" (UniqueName: \"kubernetes.io/projected/c3cdb156-f67f-4dd2-b04d-9fb263802321-kube-api-access-w9jhp\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx\" (UID: \"c3cdb156-f67f-4dd2-b04d-9fb263802321\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx" Nov 24 09:22:28 crc kubenswrapper[4563]: I1124 09:22:28.511086 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3cdb156-f67f-4dd2-b04d-9fb263802321-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx\" (UID: \"c3cdb156-f67f-4dd2-b04d-9fb263802321\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx" Nov 24 09:22:28 crc kubenswrapper[4563]: I1124 09:22:28.613011 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3cdb156-f67f-4dd2-b04d-9fb263802321-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx\" (UID: \"c3cdb156-f67f-4dd2-b04d-9fb263802321\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx" Nov 24 09:22:28 crc kubenswrapper[4563]: I1124 09:22:28.613064 4563 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3cdb156-f67f-4dd2-b04d-9fb263802321-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx\" (UID: \"c3cdb156-f67f-4dd2-b04d-9fb263802321\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx" Nov 24 09:22:28 crc kubenswrapper[4563]: I1124 09:22:28.613096 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9jhp\" (UniqueName: \"kubernetes.io/projected/c3cdb156-f67f-4dd2-b04d-9fb263802321-kube-api-access-w9jhp\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx\" (UID: \"c3cdb156-f67f-4dd2-b04d-9fb263802321\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx" Nov 24 09:22:28 crc kubenswrapper[4563]: I1124 09:22:28.613119 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3cdb156-f67f-4dd2-b04d-9fb263802321-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx\" (UID: \"c3cdb156-f67f-4dd2-b04d-9fb263802321\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx" Nov 24 09:22:28 crc kubenswrapper[4563]: I1124 09:22:28.616462 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3cdb156-f67f-4dd2-b04d-9fb263802321-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx\" (UID: \"c3cdb156-f67f-4dd2-b04d-9fb263802321\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx" Nov 24 09:22:28 crc kubenswrapper[4563]: I1124 09:22:28.616912 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3cdb156-f67f-4dd2-b04d-9fb263802321-repo-setup-combined-ca-bundle\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx\" (UID: \"c3cdb156-f67f-4dd2-b04d-9fb263802321\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx" Nov 24 09:22:28 crc kubenswrapper[4563]: I1124 09:22:28.617622 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3cdb156-f67f-4dd2-b04d-9fb263802321-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx\" (UID: \"c3cdb156-f67f-4dd2-b04d-9fb263802321\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx" Nov 24 09:22:28 crc kubenswrapper[4563]: I1124 09:22:28.629051 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9jhp\" (UniqueName: \"kubernetes.io/projected/c3cdb156-f67f-4dd2-b04d-9fb263802321-kube-api-access-w9jhp\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx\" (UID: \"c3cdb156-f67f-4dd2-b04d-9fb263802321\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx" Nov 24 09:22:28 crc kubenswrapper[4563]: I1124 09:22:28.758541 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx" Nov 24 09:22:29 crc kubenswrapper[4563]: W1124 09:22:29.209308 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3cdb156_f67f_4dd2_b04d_9fb263802321.slice/crio-4f1a2b08b36f18e90b21ed470007c768cc0366caf843141fdcd0720eb43a8063 WatchSource:0}: Error finding container 4f1a2b08b36f18e90b21ed470007c768cc0366caf843141fdcd0720eb43a8063: Status 404 returned error can't find the container with id 4f1a2b08b36f18e90b21ed470007c768cc0366caf843141fdcd0720eb43a8063 Nov 24 09:22:29 crc kubenswrapper[4563]: I1124 09:22:29.210690 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx"] Nov 24 09:22:29 crc kubenswrapper[4563]: I1124 09:22:29.289589 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"62faa658-2c71-4afe-9fc2-4d9fd0079928","Type":"ContainerStarted","Data":"bb7dddb100f64caa903eeaba2aafce82bbf8f48f9b5caffd27d1d22d1e24468f"} Nov 24 09:22:29 crc kubenswrapper[4563]: I1124 09:22:29.290648 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:22:29 crc kubenswrapper[4563]: I1124 09:22:29.292683 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"45003ba2-beec-43e7-9248-42c517ed3bf7","Type":"ContainerStarted","Data":"18cc1a0297a7f090942effdc9167d0df63a7bedfbcad98c812d3e64fb2487938"} Nov 24 09:22:29 crc kubenswrapper[4563]: I1124 09:22:29.292874 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 24 09:22:29 crc kubenswrapper[4563]: I1124 09:22:29.294345 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx" 
event={"ID":"c3cdb156-f67f-4dd2-b04d-9fb263802321","Type":"ContainerStarted","Data":"4f1a2b08b36f18e90b21ed470007c768cc0366caf843141fdcd0720eb43a8063"} Nov 24 09:22:29 crc kubenswrapper[4563]: I1124 09:22:29.313847 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=35.313833161 podStartE2EDuration="35.313833161s" podCreationTimestamp="2025-11-24 09:21:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:22:29.309414899 +0000 UTC m=+1126.568392345" watchObservedRunningTime="2025-11-24 09:22:29.313833161 +0000 UTC m=+1126.572810608" Nov 24 09:22:29 crc kubenswrapper[4563]: I1124 09:22:29.350363 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=35.350337314 podStartE2EDuration="35.350337314s" podCreationTimestamp="2025-11-24 09:21:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:22:29.346658256 +0000 UTC m=+1126.605635713" watchObservedRunningTime="2025-11-24 09:22:29.350337314 +0000 UTC m=+1126.609314761" Nov 24 09:22:37 crc kubenswrapper[4563]: I1124 09:22:37.377835 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx" event={"ID":"c3cdb156-f67f-4dd2-b04d-9fb263802321","Type":"ContainerStarted","Data":"e719609563c298ecf50d0be54d47c9f859a36213aa324979ac208cea8ebc7786"} Nov 24 09:22:37 crc kubenswrapper[4563]: I1124 09:22:37.400900 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx" podStartSLOduration=2.268558586 podStartE2EDuration="9.400881811s" podCreationTimestamp="2025-11-24 09:22:28 +0000 UTC" firstStartedPulling="2025-11-24 09:22:29.211551369 
+0000 UTC m=+1126.470528816" lastFinishedPulling="2025-11-24 09:22:36.343874594 +0000 UTC m=+1133.602852041" observedRunningTime="2025-11-24 09:22:37.393888573 +0000 UTC m=+1134.652866020" watchObservedRunningTime="2025-11-24 09:22:37.400881811 +0000 UTC m=+1134.659859259" Nov 24 09:22:44 crc kubenswrapper[4563]: I1124 09:22:44.509874 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 24 09:22:44 crc kubenswrapper[4563]: I1124 09:22:44.698881 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 24 09:22:48 crc kubenswrapper[4563]: I1124 09:22:48.472347 4563 generic.go:334] "Generic (PLEG): container finished" podID="c3cdb156-f67f-4dd2-b04d-9fb263802321" containerID="e719609563c298ecf50d0be54d47c9f859a36213aa324979ac208cea8ebc7786" exitCode=0 Nov 24 09:22:48 crc kubenswrapper[4563]: I1124 09:22:48.472423 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx" event={"ID":"c3cdb156-f67f-4dd2-b04d-9fb263802321","Type":"ContainerDied","Data":"e719609563c298ecf50d0be54d47c9f859a36213aa324979ac208cea8ebc7786"} Nov 24 09:22:49 crc kubenswrapper[4563]: I1124 09:22:49.850574 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx" Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.044453 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3cdb156-f67f-4dd2-b04d-9fb263802321-repo-setup-combined-ca-bundle\") pod \"c3cdb156-f67f-4dd2-b04d-9fb263802321\" (UID: \"c3cdb156-f67f-4dd2-b04d-9fb263802321\") " Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.044940 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3cdb156-f67f-4dd2-b04d-9fb263802321-ssh-key\") pod \"c3cdb156-f67f-4dd2-b04d-9fb263802321\" (UID: \"c3cdb156-f67f-4dd2-b04d-9fb263802321\") " Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.045204 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3cdb156-f67f-4dd2-b04d-9fb263802321-inventory\") pod \"c3cdb156-f67f-4dd2-b04d-9fb263802321\" (UID: \"c3cdb156-f67f-4dd2-b04d-9fb263802321\") " Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.045274 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9jhp\" (UniqueName: \"kubernetes.io/projected/c3cdb156-f67f-4dd2-b04d-9fb263802321-kube-api-access-w9jhp\") pod \"c3cdb156-f67f-4dd2-b04d-9fb263802321\" (UID: \"c3cdb156-f67f-4dd2-b04d-9fb263802321\") " Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.052965 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3cdb156-f67f-4dd2-b04d-9fb263802321-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "c3cdb156-f67f-4dd2-b04d-9fb263802321" (UID: "c3cdb156-f67f-4dd2-b04d-9fb263802321"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.053191 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3cdb156-f67f-4dd2-b04d-9fb263802321-kube-api-access-w9jhp" (OuterVolumeSpecName: "kube-api-access-w9jhp") pod "c3cdb156-f67f-4dd2-b04d-9fb263802321" (UID: "c3cdb156-f67f-4dd2-b04d-9fb263802321"). InnerVolumeSpecName "kube-api-access-w9jhp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.073967 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3cdb156-f67f-4dd2-b04d-9fb263802321-inventory" (OuterVolumeSpecName: "inventory") pod "c3cdb156-f67f-4dd2-b04d-9fb263802321" (UID: "c3cdb156-f67f-4dd2-b04d-9fb263802321"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.073992 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3cdb156-f67f-4dd2-b04d-9fb263802321-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c3cdb156-f67f-4dd2-b04d-9fb263802321" (UID: "c3cdb156-f67f-4dd2-b04d-9fb263802321"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.147030 4563 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3cdb156-f67f-4dd2-b04d-9fb263802321-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.147152 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9jhp\" (UniqueName: \"kubernetes.io/projected/c3cdb156-f67f-4dd2-b04d-9fb263802321-kube-api-access-w9jhp\") on node \"crc\" DevicePath \"\"" Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.147268 4563 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3cdb156-f67f-4dd2-b04d-9fb263802321-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.147315 4563 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c3cdb156-f67f-4dd2-b04d-9fb263802321-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.489542 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx" event={"ID":"c3cdb156-f67f-4dd2-b04d-9fb263802321","Type":"ContainerDied","Data":"4f1a2b08b36f18e90b21ed470007c768cc0366caf843141fdcd0720eb43a8063"} Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.489602 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f1a2b08b36f18e90b21ed470007c768cc0366caf843141fdcd0720eb43a8063" Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.489620 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx" Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.586055 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-tmkfw"] Nov 24 09:22:50 crc kubenswrapper[4563]: E1124 09:22:50.586479 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3cdb156-f67f-4dd2-b04d-9fb263802321" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.586492 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3cdb156-f67f-4dd2-b04d-9fb263802321" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.586712 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3cdb156-f67f-4dd2-b04d-9fb263802321" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.587340 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tmkfw" Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.593533 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.594987 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.595112 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.595188 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jd9jk" Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.598786 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-tmkfw"] Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.657141 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96dc1f15-b31a-4eb6-91e7-35b341f1347a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tmkfw\" (UID: \"96dc1f15-b31a-4eb6-91e7-35b341f1347a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tmkfw" Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.657271 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdpd9\" (UniqueName: \"kubernetes.io/projected/96dc1f15-b31a-4eb6-91e7-35b341f1347a-kube-api-access-hdpd9\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tmkfw\" (UID: \"96dc1f15-b31a-4eb6-91e7-35b341f1347a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tmkfw" Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.657466 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96dc1f15-b31a-4eb6-91e7-35b341f1347a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tmkfw\" (UID: \"96dc1f15-b31a-4eb6-91e7-35b341f1347a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tmkfw" Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.758765 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96dc1f15-b31a-4eb6-91e7-35b341f1347a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tmkfw\" (UID: \"96dc1f15-b31a-4eb6-91e7-35b341f1347a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tmkfw" Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.758842 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdpd9\" (UniqueName: \"kubernetes.io/projected/96dc1f15-b31a-4eb6-91e7-35b341f1347a-kube-api-access-hdpd9\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tmkfw\" (UID: \"96dc1f15-b31a-4eb6-91e7-35b341f1347a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tmkfw" Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.758909 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96dc1f15-b31a-4eb6-91e7-35b341f1347a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tmkfw\" (UID: \"96dc1f15-b31a-4eb6-91e7-35b341f1347a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tmkfw" Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.763331 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96dc1f15-b31a-4eb6-91e7-35b341f1347a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tmkfw\" (UID: \"96dc1f15-b31a-4eb6-91e7-35b341f1347a\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tmkfw" Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.763331 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96dc1f15-b31a-4eb6-91e7-35b341f1347a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tmkfw\" (UID: \"96dc1f15-b31a-4eb6-91e7-35b341f1347a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tmkfw" Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.774600 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdpd9\" (UniqueName: \"kubernetes.io/projected/96dc1f15-b31a-4eb6-91e7-35b341f1347a-kube-api-access-hdpd9\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-tmkfw\" (UID: \"96dc1f15-b31a-4eb6-91e7-35b341f1347a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tmkfw" Nov 24 09:22:50 crc kubenswrapper[4563]: I1124 09:22:50.902089 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tmkfw" Nov 24 09:22:51 crc kubenswrapper[4563]: I1124 09:22:51.374499 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-tmkfw"] Nov 24 09:22:51 crc kubenswrapper[4563]: I1124 09:22:51.377238 4563 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 09:22:51 crc kubenswrapper[4563]: I1124 09:22:51.499693 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tmkfw" event={"ID":"96dc1f15-b31a-4eb6-91e7-35b341f1347a","Type":"ContainerStarted","Data":"cb9ed363dad856f78d3bef2734071b108bb9d81a4b814dc86abba7799939b4c5"} Nov 24 09:22:52 crc kubenswrapper[4563]: I1124 09:22:52.512552 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tmkfw" event={"ID":"96dc1f15-b31a-4eb6-91e7-35b341f1347a","Type":"ContainerStarted","Data":"a26a9cc958698b2e72d29adf378982c2d790c136addba62f36a5145b801f49e6"} Nov 24 09:22:52 crc kubenswrapper[4563]: I1124 09:22:52.534524 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tmkfw" podStartSLOduration=2.030992314 podStartE2EDuration="2.53450866s" podCreationTimestamp="2025-11-24 09:22:50 +0000 UTC" firstStartedPulling="2025-11-24 09:22:51.376968168 +0000 UTC m=+1148.635945615" lastFinishedPulling="2025-11-24 09:22:51.880484514 +0000 UTC m=+1149.139461961" observedRunningTime="2025-11-24 09:22:52.525532883 +0000 UTC m=+1149.784510330" watchObservedRunningTime="2025-11-24 09:22:52.53450866 +0000 UTC m=+1149.793486107" Nov 24 09:22:54 crc kubenswrapper[4563]: I1124 09:22:54.542198 4563 generic.go:334] "Generic (PLEG): container finished" podID="96dc1f15-b31a-4eb6-91e7-35b341f1347a" containerID="a26a9cc958698b2e72d29adf378982c2d790c136addba62f36a5145b801f49e6" 
exitCode=0 Nov 24 09:22:54 crc kubenswrapper[4563]: I1124 09:22:54.542285 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tmkfw" event={"ID":"96dc1f15-b31a-4eb6-91e7-35b341f1347a","Type":"ContainerDied","Data":"a26a9cc958698b2e72d29adf378982c2d790c136addba62f36a5145b801f49e6"} Nov 24 09:22:55 crc kubenswrapper[4563]: I1124 09:22:55.910861 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tmkfw" Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.067632 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96dc1f15-b31a-4eb6-91e7-35b341f1347a-ssh-key\") pod \"96dc1f15-b31a-4eb6-91e7-35b341f1347a\" (UID: \"96dc1f15-b31a-4eb6-91e7-35b341f1347a\") " Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.067849 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96dc1f15-b31a-4eb6-91e7-35b341f1347a-inventory\") pod \"96dc1f15-b31a-4eb6-91e7-35b341f1347a\" (UID: \"96dc1f15-b31a-4eb6-91e7-35b341f1347a\") " Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.068125 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdpd9\" (UniqueName: \"kubernetes.io/projected/96dc1f15-b31a-4eb6-91e7-35b341f1347a-kube-api-access-hdpd9\") pod \"96dc1f15-b31a-4eb6-91e7-35b341f1347a\" (UID: \"96dc1f15-b31a-4eb6-91e7-35b341f1347a\") " Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.073799 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96dc1f15-b31a-4eb6-91e7-35b341f1347a-kube-api-access-hdpd9" (OuterVolumeSpecName: "kube-api-access-hdpd9") pod "96dc1f15-b31a-4eb6-91e7-35b341f1347a" (UID: "96dc1f15-b31a-4eb6-91e7-35b341f1347a"). 
InnerVolumeSpecName "kube-api-access-hdpd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.092875 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96dc1f15-b31a-4eb6-91e7-35b341f1347a-inventory" (OuterVolumeSpecName: "inventory") pod "96dc1f15-b31a-4eb6-91e7-35b341f1347a" (UID: "96dc1f15-b31a-4eb6-91e7-35b341f1347a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.094206 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96dc1f15-b31a-4eb6-91e7-35b341f1347a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "96dc1f15-b31a-4eb6-91e7-35b341f1347a" (UID: "96dc1f15-b31a-4eb6-91e7-35b341f1347a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.173183 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdpd9\" (UniqueName: \"kubernetes.io/projected/96dc1f15-b31a-4eb6-91e7-35b341f1347a-kube-api-access-hdpd9\") on node \"crc\" DevicePath \"\"" Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.173219 4563 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96dc1f15-b31a-4eb6-91e7-35b341f1347a-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.173231 4563 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96dc1f15-b31a-4eb6-91e7-35b341f1347a-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.562452 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tmkfw" 
event={"ID":"96dc1f15-b31a-4eb6-91e7-35b341f1347a","Type":"ContainerDied","Data":"cb9ed363dad856f78d3bef2734071b108bb9d81a4b814dc86abba7799939b4c5"} Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.562504 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb9ed363dad856f78d3bef2734071b108bb9d81a4b814dc86abba7799939b4c5" Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.562528 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-tmkfw" Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.693042 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2"] Nov 24 09:22:56 crc kubenswrapper[4563]: E1124 09:22:56.693429 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96dc1f15-b31a-4eb6-91e7-35b341f1347a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.693448 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="96dc1f15-b31a-4eb6-91e7-35b341f1347a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.693690 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="96dc1f15-b31a-4eb6-91e7-35b341f1347a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.694284 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2" Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.698492 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jd9jk" Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.698771 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.699027 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.699027 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.725998 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2"] Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.783475 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c07ab91-ccc4-46d0-b15c-0d20675fc19a-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2\" (UID: \"1c07ab91-ccc4-46d0-b15c-0d20675fc19a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2" Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.783615 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c07ab91-ccc4-46d0-b15c-0d20675fc19a-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2\" (UID: \"1c07ab91-ccc4-46d0-b15c-0d20675fc19a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2" Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.783679 4563 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c07ab91-ccc4-46d0-b15c-0d20675fc19a-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2\" (UID: \"1c07ab91-ccc4-46d0-b15c-0d20675fc19a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2" Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.783706 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2289\" (UniqueName: \"kubernetes.io/projected/1c07ab91-ccc4-46d0-b15c-0d20675fc19a-kube-api-access-d2289\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2\" (UID: \"1c07ab91-ccc4-46d0-b15c-0d20675fc19a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2" Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.885909 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c07ab91-ccc4-46d0-b15c-0d20675fc19a-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2\" (UID: \"1c07ab91-ccc4-46d0-b15c-0d20675fc19a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2" Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.886028 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c07ab91-ccc4-46d0-b15c-0d20675fc19a-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2\" (UID: \"1c07ab91-ccc4-46d0-b15c-0d20675fc19a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2" Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.886073 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2289\" (UniqueName: \"kubernetes.io/projected/1c07ab91-ccc4-46d0-b15c-0d20675fc19a-kube-api-access-d2289\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2\" (UID: \"1c07ab91-ccc4-46d0-b15c-0d20675fc19a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2" Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.886177 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c07ab91-ccc4-46d0-b15c-0d20675fc19a-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2\" (UID: \"1c07ab91-ccc4-46d0-b15c-0d20675fc19a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2" Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.891472 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c07ab91-ccc4-46d0-b15c-0d20675fc19a-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2\" (UID: \"1c07ab91-ccc4-46d0-b15c-0d20675fc19a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2" Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.892123 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c07ab91-ccc4-46d0-b15c-0d20675fc19a-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2\" (UID: \"1c07ab91-ccc4-46d0-b15c-0d20675fc19a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2" Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.892377 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c07ab91-ccc4-46d0-b15c-0d20675fc19a-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2\" (UID: \"1c07ab91-ccc4-46d0-b15c-0d20675fc19a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2" Nov 24 09:22:56 crc kubenswrapper[4563]: I1124 09:22:56.902215 4563 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2289\" (UniqueName: \"kubernetes.io/projected/1c07ab91-ccc4-46d0-b15c-0d20675fc19a-kube-api-access-d2289\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2\" (UID: \"1c07ab91-ccc4-46d0-b15c-0d20675fc19a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2" Nov 24 09:22:57 crc kubenswrapper[4563]: I1124 09:22:57.007391 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2" Nov 24 09:22:57 crc kubenswrapper[4563]: I1124 09:22:57.483155 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2"] Nov 24 09:22:57 crc kubenswrapper[4563]: I1124 09:22:57.572897 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2" event={"ID":"1c07ab91-ccc4-46d0-b15c-0d20675fc19a","Type":"ContainerStarted","Data":"ac652a07e30b30c254094f6b9e7049e9c5d8e78f0b5d76db578ab4ba43ac53ac"} Nov 24 09:22:58 crc kubenswrapper[4563]: I1124 09:22:58.581886 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2" event={"ID":"1c07ab91-ccc4-46d0-b15c-0d20675fc19a","Type":"ContainerStarted","Data":"d1e4bfc4981e60c914923ce00ead9250be7d7d7a4137a48361f9d29784000a97"} Nov 24 09:22:58 crc kubenswrapper[4563]: I1124 09:22:58.601775 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2" podStartSLOduration=2.097216205 podStartE2EDuration="2.601754307s" podCreationTimestamp="2025-11-24 09:22:56 +0000 UTC" firstStartedPulling="2025-11-24 09:22:57.488933041 +0000 UTC m=+1154.747910487" lastFinishedPulling="2025-11-24 09:22:57.993471143 +0000 UTC m=+1155.252448589" observedRunningTime="2025-11-24 09:22:58.595352213 +0000 UTC 
m=+1155.854329660" watchObservedRunningTime="2025-11-24 09:22:58.601754307 +0000 UTC m=+1155.860731754" Nov 24 09:23:38 crc kubenswrapper[4563]: I1124 09:23:38.988022 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:23:38 crc kubenswrapper[4563]: I1124 09:23:38.988465 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:24:08 crc kubenswrapper[4563]: I1124 09:24:08.987733 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:24:08 crc kubenswrapper[4563]: I1124 09:24:08.988717 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:24:38 crc kubenswrapper[4563]: I1124 09:24:38.987427 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:24:38 crc kubenswrapper[4563]: I1124 
09:24:38.987825 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:24:38 crc kubenswrapper[4563]: I1124 09:24:38.987872 4563 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" Nov 24 09:24:38 crc kubenswrapper[4563]: I1124 09:24:38.988321 4563 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"910d483568e80e7c5051043679fd4f1476c6c059619a02146f6b5a112281d18f"} pod="openshift-machine-config-operator/machine-config-daemon-stlxr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 09:24:38 crc kubenswrapper[4563]: I1124 09:24:38.988388 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" containerID="cri-o://910d483568e80e7c5051043679fd4f1476c6c059619a02146f6b5a112281d18f" gracePeriod=600 Nov 24 09:24:39 crc kubenswrapper[4563]: I1124 09:24:39.375945 4563 generic.go:334] "Generic (PLEG): container finished" podID="3b2bfe55-8989-49b3-bb61-e28189447627" containerID="910d483568e80e7c5051043679fd4f1476c6c059619a02146f6b5a112281d18f" exitCode=0 Nov 24 09:24:39 crc kubenswrapper[4563]: I1124 09:24:39.375994 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" event={"ID":"3b2bfe55-8989-49b3-bb61-e28189447627","Type":"ContainerDied","Data":"910d483568e80e7c5051043679fd4f1476c6c059619a02146f6b5a112281d18f"} Nov 24 09:24:39 crc 
kubenswrapper[4563]: I1124 09:24:39.376166 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" event={"ID":"3b2bfe55-8989-49b3-bb61-e28189447627","Type":"ContainerStarted","Data":"4d8f48825147068e682924024fb98e71a696f2055c921253ed4d8afbad01ed41"} Nov 24 09:24:39 crc kubenswrapper[4563]: I1124 09:24:39.376194 4563 scope.go:117] "RemoveContainer" containerID="03f875f88eef557bff28f5ed0a9f361fdac2df81584f5d9cbef7e181ab4ba280" Nov 24 09:24:51 crc kubenswrapper[4563]: I1124 09:24:51.004443 4563 scope.go:117] "RemoveContainer" containerID="30a4d5df861b3290bb9630c2ee99164697fb288bb158d589ff153afed7599d4d" Nov 24 09:24:51 crc kubenswrapper[4563]: I1124 09:24:51.033587 4563 scope.go:117] "RemoveContainer" containerID="d9d022498ed0865c43bb0bed4db10fba9b82ea28d4406bb092ba58ed744d314c" Nov 24 09:24:51 crc kubenswrapper[4563]: I1124 09:24:51.064609 4563 scope.go:117] "RemoveContainer" containerID="7fc5adf7e7dd334c85a9401b77c6e1ba3a7cc1d64970081f9bb4a7c460841bbf" Nov 24 09:25:50 crc kubenswrapper[4563]: I1124 09:25:50.939066 4563 generic.go:334] "Generic (PLEG): container finished" podID="1c07ab91-ccc4-46d0-b15c-0d20675fc19a" containerID="d1e4bfc4981e60c914923ce00ead9250be7d7d7a4137a48361f9d29784000a97" exitCode=0 Nov 24 09:25:50 crc kubenswrapper[4563]: I1124 09:25:50.940174 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2" event={"ID":"1c07ab91-ccc4-46d0-b15c-0d20675fc19a","Type":"ContainerDied","Data":"d1e4bfc4981e60c914923ce00ead9250be7d7d7a4137a48361f9d29784000a97"} Nov 24 09:25:52 crc kubenswrapper[4563]: I1124 09:25:52.255299 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2" Nov 24 09:25:52 crc kubenswrapper[4563]: I1124 09:25:52.295011 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c07ab91-ccc4-46d0-b15c-0d20675fc19a-ssh-key\") pod \"1c07ab91-ccc4-46d0-b15c-0d20675fc19a\" (UID: \"1c07ab91-ccc4-46d0-b15c-0d20675fc19a\") " Nov 24 09:25:52 crc kubenswrapper[4563]: I1124 09:25:52.295119 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c07ab91-ccc4-46d0-b15c-0d20675fc19a-inventory\") pod \"1c07ab91-ccc4-46d0-b15c-0d20675fc19a\" (UID: \"1c07ab91-ccc4-46d0-b15c-0d20675fc19a\") " Nov 24 09:25:52 crc kubenswrapper[4563]: I1124 09:25:52.295164 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2289\" (UniqueName: \"kubernetes.io/projected/1c07ab91-ccc4-46d0-b15c-0d20675fc19a-kube-api-access-d2289\") pod \"1c07ab91-ccc4-46d0-b15c-0d20675fc19a\" (UID: \"1c07ab91-ccc4-46d0-b15c-0d20675fc19a\") " Nov 24 09:25:52 crc kubenswrapper[4563]: I1124 09:25:52.295225 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c07ab91-ccc4-46d0-b15c-0d20675fc19a-bootstrap-combined-ca-bundle\") pod \"1c07ab91-ccc4-46d0-b15c-0d20675fc19a\" (UID: \"1c07ab91-ccc4-46d0-b15c-0d20675fc19a\") " Nov 24 09:25:52 crc kubenswrapper[4563]: I1124 09:25:52.303783 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c07ab91-ccc4-46d0-b15c-0d20675fc19a-kube-api-access-d2289" (OuterVolumeSpecName: "kube-api-access-d2289") pod "1c07ab91-ccc4-46d0-b15c-0d20675fc19a" (UID: "1c07ab91-ccc4-46d0-b15c-0d20675fc19a"). InnerVolumeSpecName "kube-api-access-d2289". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:25:52 crc kubenswrapper[4563]: I1124 09:25:52.303790 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c07ab91-ccc4-46d0-b15c-0d20675fc19a-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "1c07ab91-ccc4-46d0-b15c-0d20675fc19a" (UID: "1c07ab91-ccc4-46d0-b15c-0d20675fc19a"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:25:52 crc kubenswrapper[4563]: I1124 09:25:52.319109 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c07ab91-ccc4-46d0-b15c-0d20675fc19a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1c07ab91-ccc4-46d0-b15c-0d20675fc19a" (UID: "1c07ab91-ccc4-46d0-b15c-0d20675fc19a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:25:52 crc kubenswrapper[4563]: I1124 09:25:52.319234 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c07ab91-ccc4-46d0-b15c-0d20675fc19a-inventory" (OuterVolumeSpecName: "inventory") pod "1c07ab91-ccc4-46d0-b15c-0d20675fc19a" (UID: "1c07ab91-ccc4-46d0-b15c-0d20675fc19a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:25:52 crc kubenswrapper[4563]: I1124 09:25:52.397846 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2289\" (UniqueName: \"kubernetes.io/projected/1c07ab91-ccc4-46d0-b15c-0d20675fc19a-kube-api-access-d2289\") on node \"crc\" DevicePath \"\"" Nov 24 09:25:52 crc kubenswrapper[4563]: I1124 09:25:52.397891 4563 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c07ab91-ccc4-46d0-b15c-0d20675fc19a-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:25:52 crc kubenswrapper[4563]: I1124 09:25:52.397905 4563 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c07ab91-ccc4-46d0-b15c-0d20675fc19a-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 09:25:52 crc kubenswrapper[4563]: I1124 09:25:52.397916 4563 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c07ab91-ccc4-46d0-b15c-0d20675fc19a-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 09:25:52 crc kubenswrapper[4563]: I1124 09:25:52.961282 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2" event={"ID":"1c07ab91-ccc4-46d0-b15c-0d20675fc19a","Type":"ContainerDied","Data":"ac652a07e30b30c254094f6b9e7049e9c5d8e78f0b5d76db578ab4ba43ac53ac"} Nov 24 09:25:52 crc kubenswrapper[4563]: I1124 09:25:52.961621 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac652a07e30b30c254094f6b9e7049e9c5d8e78f0b5d76db578ab4ba43ac53ac" Nov 24 09:25:52 crc kubenswrapper[4563]: I1124 09:25:52.961328 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2" Nov 24 09:25:53 crc kubenswrapper[4563]: I1124 09:25:53.031620 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-srkpz"] Nov 24 09:25:53 crc kubenswrapper[4563]: E1124 09:25:53.032670 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c07ab91-ccc4-46d0-b15c-0d20675fc19a" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 24 09:25:53 crc kubenswrapper[4563]: I1124 09:25:53.032690 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c07ab91-ccc4-46d0-b15c-0d20675fc19a" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 24 09:25:53 crc kubenswrapper[4563]: I1124 09:25:53.032936 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c07ab91-ccc4-46d0-b15c-0d20675fc19a" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 24 09:25:53 crc kubenswrapper[4563]: I1124 09:25:53.033757 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-srkpz" Nov 24 09:25:53 crc kubenswrapper[4563]: I1124 09:25:53.035465 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jd9jk" Nov 24 09:25:53 crc kubenswrapper[4563]: I1124 09:25:53.037817 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 09:25:53 crc kubenswrapper[4563]: I1124 09:25:53.037856 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 24 09:25:53 crc kubenswrapper[4563]: I1124 09:25:53.038191 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 24 09:25:53 crc kubenswrapper[4563]: I1124 09:25:53.050447 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-srkpz"] Nov 24 09:25:53 crc kubenswrapper[4563]: I1124 09:25:53.211962 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmzcj\" (UniqueName: \"kubernetes.io/projected/75442289-63cd-4b6c-b86d-70ab08ae8dc2-kube-api-access-tmzcj\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-srkpz\" (UID: \"75442289-63cd-4b6c-b86d-70ab08ae8dc2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-srkpz" Nov 24 09:25:53 crc kubenswrapper[4563]: I1124 09:25:53.212068 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75442289-63cd-4b6c-b86d-70ab08ae8dc2-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-srkpz\" (UID: \"75442289-63cd-4b6c-b86d-70ab08ae8dc2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-srkpz" Nov 24 09:25:53 crc kubenswrapper[4563]: I1124 
09:25:53.212139 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75442289-63cd-4b6c-b86d-70ab08ae8dc2-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-srkpz\" (UID: \"75442289-63cd-4b6c-b86d-70ab08ae8dc2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-srkpz" Nov 24 09:25:53 crc kubenswrapper[4563]: I1124 09:25:53.314107 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmzcj\" (UniqueName: \"kubernetes.io/projected/75442289-63cd-4b6c-b86d-70ab08ae8dc2-kube-api-access-tmzcj\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-srkpz\" (UID: \"75442289-63cd-4b6c-b86d-70ab08ae8dc2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-srkpz" Nov 24 09:25:53 crc kubenswrapper[4563]: I1124 09:25:53.314241 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75442289-63cd-4b6c-b86d-70ab08ae8dc2-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-srkpz\" (UID: \"75442289-63cd-4b6c-b86d-70ab08ae8dc2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-srkpz" Nov 24 09:25:53 crc kubenswrapper[4563]: I1124 09:25:53.314358 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75442289-63cd-4b6c-b86d-70ab08ae8dc2-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-srkpz\" (UID: \"75442289-63cd-4b6c-b86d-70ab08ae8dc2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-srkpz" Nov 24 09:25:53 crc kubenswrapper[4563]: I1124 09:25:53.320439 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75442289-63cd-4b6c-b86d-70ab08ae8dc2-ssh-key\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-srkpz\" (UID: \"75442289-63cd-4b6c-b86d-70ab08ae8dc2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-srkpz" Nov 24 09:25:53 crc kubenswrapper[4563]: I1124 09:25:53.320700 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75442289-63cd-4b6c-b86d-70ab08ae8dc2-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-srkpz\" (UID: \"75442289-63cd-4b6c-b86d-70ab08ae8dc2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-srkpz" Nov 24 09:25:53 crc kubenswrapper[4563]: I1124 09:25:53.329169 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmzcj\" (UniqueName: \"kubernetes.io/projected/75442289-63cd-4b6c-b86d-70ab08ae8dc2-kube-api-access-tmzcj\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-srkpz\" (UID: \"75442289-63cd-4b6c-b86d-70ab08ae8dc2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-srkpz" Nov 24 09:25:53 crc kubenswrapper[4563]: I1124 09:25:53.349421 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-srkpz" Nov 24 09:25:53 crc kubenswrapper[4563]: I1124 09:25:53.810787 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-srkpz"] Nov 24 09:25:53 crc kubenswrapper[4563]: I1124 09:25:53.971386 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-srkpz" event={"ID":"75442289-63cd-4b6c-b86d-70ab08ae8dc2","Type":"ContainerStarted","Data":"da76a810910ae78549a787fa1dfcd3b482b57cdf3a2526ae9d9f2578feb3c74e"} Nov 24 09:25:54 crc kubenswrapper[4563]: I1124 09:25:54.981117 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-srkpz" event={"ID":"75442289-63cd-4b6c-b86d-70ab08ae8dc2","Type":"ContainerStarted","Data":"c64bf3bd505101d0768232a745276b3a6958b369d0fbed6b4d74e2bba58c5d62"} Nov 24 09:25:54 crc kubenswrapper[4563]: I1124 09:25:54.998661 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-srkpz" podStartSLOduration=1.514933402 podStartE2EDuration="1.998629323s" podCreationTimestamp="2025-11-24 09:25:53 +0000 UTC" firstStartedPulling="2025-11-24 09:25:53.815774575 +0000 UTC m=+1331.074752023" lastFinishedPulling="2025-11-24 09:25:54.299470497 +0000 UTC m=+1331.558447944" observedRunningTime="2025-11-24 09:25:54.993671779 +0000 UTC m=+1332.252649226" watchObservedRunningTime="2025-11-24 09:25:54.998629323 +0000 UTC m=+1332.257606770" Nov 24 09:26:51 crc kubenswrapper[4563]: I1124 09:26:51.153542 4563 scope.go:117] "RemoveContainer" containerID="daf4829fa7ff43fc586194d9c583f00284566907b2991030a438f4e5c5c3ebaf" Nov 24 09:27:08 crc kubenswrapper[4563]: I1124 09:27:08.987133 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:27:08 crc kubenswrapper[4563]: I1124 09:27:08.987710 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:27:25 crc kubenswrapper[4563]: I1124 09:27:25.712965 4563 generic.go:334] "Generic (PLEG): container finished" podID="75442289-63cd-4b6c-b86d-70ab08ae8dc2" containerID="c64bf3bd505101d0768232a745276b3a6958b369d0fbed6b4d74e2bba58c5d62" exitCode=0 Nov 24 09:27:25 crc kubenswrapper[4563]: I1124 09:27:25.713053 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-srkpz" event={"ID":"75442289-63cd-4b6c-b86d-70ab08ae8dc2","Type":"ContainerDied","Data":"c64bf3bd505101d0768232a745276b3a6958b369d0fbed6b4d74e2bba58c5d62"} Nov 24 09:27:27 crc kubenswrapper[4563]: I1124 09:27:27.033297 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-srkpz" Nov 24 09:27:27 crc kubenswrapper[4563]: I1124 09:27:27.221516 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75442289-63cd-4b6c-b86d-70ab08ae8dc2-ssh-key\") pod \"75442289-63cd-4b6c-b86d-70ab08ae8dc2\" (UID: \"75442289-63cd-4b6c-b86d-70ab08ae8dc2\") " Nov 24 09:27:27 crc kubenswrapper[4563]: I1124 09:27:27.221829 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmzcj\" (UniqueName: \"kubernetes.io/projected/75442289-63cd-4b6c-b86d-70ab08ae8dc2-kube-api-access-tmzcj\") pod \"75442289-63cd-4b6c-b86d-70ab08ae8dc2\" (UID: \"75442289-63cd-4b6c-b86d-70ab08ae8dc2\") " Nov 24 09:27:27 crc kubenswrapper[4563]: I1124 09:27:27.222265 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75442289-63cd-4b6c-b86d-70ab08ae8dc2-inventory\") pod \"75442289-63cd-4b6c-b86d-70ab08ae8dc2\" (UID: \"75442289-63cd-4b6c-b86d-70ab08ae8dc2\") " Nov 24 09:27:27 crc kubenswrapper[4563]: I1124 09:27:27.226944 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75442289-63cd-4b6c-b86d-70ab08ae8dc2-kube-api-access-tmzcj" (OuterVolumeSpecName: "kube-api-access-tmzcj") pod "75442289-63cd-4b6c-b86d-70ab08ae8dc2" (UID: "75442289-63cd-4b6c-b86d-70ab08ae8dc2"). InnerVolumeSpecName "kube-api-access-tmzcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:27:27 crc kubenswrapper[4563]: I1124 09:27:27.242626 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75442289-63cd-4b6c-b86d-70ab08ae8dc2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "75442289-63cd-4b6c-b86d-70ab08ae8dc2" (UID: "75442289-63cd-4b6c-b86d-70ab08ae8dc2"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:27:27 crc kubenswrapper[4563]: I1124 09:27:27.248232 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75442289-63cd-4b6c-b86d-70ab08ae8dc2-inventory" (OuterVolumeSpecName: "inventory") pod "75442289-63cd-4b6c-b86d-70ab08ae8dc2" (UID: "75442289-63cd-4b6c-b86d-70ab08ae8dc2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:27:27 crc kubenswrapper[4563]: I1124 09:27:27.325380 4563 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75442289-63cd-4b6c-b86d-70ab08ae8dc2-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 09:27:27 crc kubenswrapper[4563]: I1124 09:27:27.325428 4563 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/75442289-63cd-4b6c-b86d-70ab08ae8dc2-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 09:27:27 crc kubenswrapper[4563]: I1124 09:27:27.325439 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmzcj\" (UniqueName: \"kubernetes.io/projected/75442289-63cd-4b6c-b86d-70ab08ae8dc2-kube-api-access-tmzcj\") on node \"crc\" DevicePath \"\"" Nov 24 09:27:27 crc kubenswrapper[4563]: I1124 09:27:27.738685 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-srkpz" event={"ID":"75442289-63cd-4b6c-b86d-70ab08ae8dc2","Type":"ContainerDied","Data":"da76a810910ae78549a787fa1dfcd3b482b57cdf3a2526ae9d9f2578feb3c74e"} Nov 24 09:27:27 crc kubenswrapper[4563]: I1124 09:27:27.738728 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-srkpz" Nov 24 09:27:27 crc kubenswrapper[4563]: I1124 09:27:27.738729 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da76a810910ae78549a787fa1dfcd3b482b57cdf3a2526ae9d9f2578feb3c74e" Nov 24 09:27:27 crc kubenswrapper[4563]: I1124 09:27:27.800354 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj"] Nov 24 09:27:27 crc kubenswrapper[4563]: E1124 09:27:27.800910 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75442289-63cd-4b6c-b86d-70ab08ae8dc2" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Nov 24 09:27:27 crc kubenswrapper[4563]: I1124 09:27:27.800935 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="75442289-63cd-4b6c-b86d-70ab08ae8dc2" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Nov 24 09:27:27 crc kubenswrapper[4563]: I1124 09:27:27.801225 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="75442289-63cd-4b6c-b86d-70ab08ae8dc2" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Nov 24 09:27:27 crc kubenswrapper[4563]: I1124 09:27:27.802076 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj" Nov 24 09:27:27 crc kubenswrapper[4563]: I1124 09:27:27.803937 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 24 09:27:27 crc kubenswrapper[4563]: I1124 09:27:27.804378 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jd9jk" Nov 24 09:27:27 crc kubenswrapper[4563]: I1124 09:27:27.804565 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 24 09:27:27 crc kubenswrapper[4563]: I1124 09:27:27.804810 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 09:27:27 crc kubenswrapper[4563]: I1124 09:27:27.808608 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj"] Nov 24 09:27:27 crc kubenswrapper[4563]: I1124 09:27:27.933837 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea6d045d-1394-436f-9329-9f3a9d10610b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj\" (UID: \"ea6d045d-1394-436f-9329-9f3a9d10610b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj" Nov 24 09:27:27 crc kubenswrapper[4563]: I1124 09:27:27.933927 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea6d045d-1394-436f-9329-9f3a9d10610b-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj\" (UID: \"ea6d045d-1394-436f-9329-9f3a9d10610b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj" Nov 24 09:27:27 crc kubenswrapper[4563]: I1124 09:27:27.933958 4563 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnhnm\" (UniqueName: \"kubernetes.io/projected/ea6d045d-1394-436f-9329-9f3a9d10610b-kube-api-access-bnhnm\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj\" (UID: \"ea6d045d-1394-436f-9329-9f3a9d10610b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj" Nov 24 09:27:28 crc kubenswrapper[4563]: I1124 09:27:28.035825 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea6d045d-1394-436f-9329-9f3a9d10610b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj\" (UID: \"ea6d045d-1394-436f-9329-9f3a9d10610b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj" Nov 24 09:27:28 crc kubenswrapper[4563]: I1124 09:27:28.035909 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea6d045d-1394-436f-9329-9f3a9d10610b-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj\" (UID: \"ea6d045d-1394-436f-9329-9f3a9d10610b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj" Nov 24 09:27:28 crc kubenswrapper[4563]: I1124 09:27:28.035941 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnhnm\" (UniqueName: \"kubernetes.io/projected/ea6d045d-1394-436f-9329-9f3a9d10610b-kube-api-access-bnhnm\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj\" (UID: \"ea6d045d-1394-436f-9329-9f3a9d10610b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj" Nov 24 09:27:28 crc kubenswrapper[4563]: I1124 09:27:28.041208 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea6d045d-1394-436f-9329-9f3a9d10610b-inventory\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj\" (UID: \"ea6d045d-1394-436f-9329-9f3a9d10610b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj" Nov 24 09:27:28 crc kubenswrapper[4563]: I1124 09:27:28.041813 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea6d045d-1394-436f-9329-9f3a9d10610b-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj\" (UID: \"ea6d045d-1394-436f-9329-9f3a9d10610b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj" Nov 24 09:27:28 crc kubenswrapper[4563]: I1124 09:27:28.051779 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnhnm\" (UniqueName: \"kubernetes.io/projected/ea6d045d-1394-436f-9329-9f3a9d10610b-kube-api-access-bnhnm\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj\" (UID: \"ea6d045d-1394-436f-9329-9f3a9d10610b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj" Nov 24 09:27:28 crc kubenswrapper[4563]: I1124 09:27:28.117352 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj" Nov 24 09:27:28 crc kubenswrapper[4563]: I1124 09:27:28.562437 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj"] Nov 24 09:27:28 crc kubenswrapper[4563]: I1124 09:27:28.748101 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj" event={"ID":"ea6d045d-1394-436f-9329-9f3a9d10610b","Type":"ContainerStarted","Data":"4c112e1422d3c6ff33ec809b48a289d8465ba13f14aab75e4cd7e42345ace642"} Nov 24 09:27:29 crc kubenswrapper[4563]: I1124 09:27:29.757973 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj" event={"ID":"ea6d045d-1394-436f-9329-9f3a9d10610b","Type":"ContainerStarted","Data":"b6ea7d2c74e1ba15dd88e95db97d4ecb79866bfaa06c88bbfd2ac613fc460771"} Nov 24 09:27:29 crc kubenswrapper[4563]: I1124 09:27:29.774681 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj" podStartSLOduration=2.2380482170000002 podStartE2EDuration="2.774664422s" podCreationTimestamp="2025-11-24 09:27:27 +0000 UTC" firstStartedPulling="2025-11-24 09:27:28.564056382 +0000 UTC m=+1425.823033830" lastFinishedPulling="2025-11-24 09:27:29.100672588 +0000 UTC m=+1426.359650035" observedRunningTime="2025-11-24 09:27:29.770252388 +0000 UTC m=+1427.029229835" watchObservedRunningTime="2025-11-24 09:27:29.774664422 +0000 UTC m=+1427.033641869" Nov 24 09:27:38 crc kubenswrapper[4563]: I1124 09:27:38.987193 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Nov 24 09:27:38 crc kubenswrapper[4563]: I1124 09:27:38.987670 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:27:49 crc kubenswrapper[4563]: I1124 09:27:49.033460 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-2820-account-create-rzsg6"] Nov 24 09:27:49 crc kubenswrapper[4563]: I1124 09:27:49.042900 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-2c9nb"] Nov 24 09:27:49 crc kubenswrapper[4563]: I1124 09:27:49.051374 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-2820-account-create-rzsg6"] Nov 24 09:27:49 crc kubenswrapper[4563]: I1124 09:27:49.068820 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6072ef52-6143-434b-b56f-06e3d11c966f" path="/var/lib/kubelet/pods/6072ef52-6143-434b-b56f-06e3d11c966f/volumes" Nov 24 09:27:49 crc kubenswrapper[4563]: I1124 09:27:49.069470 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-2snzz"] Nov 24 09:27:49 crc kubenswrapper[4563]: I1124 09:27:49.077056 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-aa19-account-create-bp4rh"] Nov 24 09:27:49 crc kubenswrapper[4563]: I1124 09:27:49.080424 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-2c9nb"] Nov 24 09:27:49 crc kubenswrapper[4563]: I1124 09:27:49.086842 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-aa19-account-create-bp4rh"] Nov 24 09:27:49 crc kubenswrapper[4563]: I1124 09:27:49.092794 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-2snzz"] Nov 24 09:27:51 
crc kubenswrapper[4563]: I1124 09:27:51.063133 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="579d7285-3560-4795-8a38-516fd67df1f4" path="/var/lib/kubelet/pods/579d7285-3560-4795-8a38-516fd67df1f4/volumes" Nov 24 09:27:51 crc kubenswrapper[4563]: I1124 09:27:51.064062 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6005814-b899-4e79-816e-c51ffbe41a91" path="/var/lib/kubelet/pods/b6005814-b899-4e79-816e-c51ffbe41a91/volumes" Nov 24 09:27:51 crc kubenswrapper[4563]: I1124 09:27:51.064631 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8c3184f-c96d-4339-8d4c-af9b3fa9a04d" path="/var/lib/kubelet/pods/e8c3184f-c96d-4339-8d4c-af9b3fa9a04d/volumes" Nov 24 09:27:51 crc kubenswrapper[4563]: I1124 09:27:51.196150 4563 scope.go:117] "RemoveContainer" containerID="2573716fbde1d07a7b809aa931d164137b737d6fe6779f91fe62bbdbb764872a" Nov 24 09:27:51 crc kubenswrapper[4563]: I1124 09:27:51.213206 4563 scope.go:117] "RemoveContainer" containerID="fab1e793dd14f0e6449009b3d828296a7e058bb38a079b344e74f0e2b72b702b" Nov 24 09:27:51 crc kubenswrapper[4563]: I1124 09:27:51.250502 4563 scope.go:117] "RemoveContainer" containerID="9b592cd0dae33a2878daf2d9d6185ad96e65f221e44d94981a0809304730d61e" Nov 24 09:27:51 crc kubenswrapper[4563]: I1124 09:27:51.280896 4563 scope.go:117] "RemoveContainer" containerID="7c3e6205926990d76ca04675ed9f5ffc5ff948f981d63a7df9649ea1af2208b6" Nov 24 09:27:51 crc kubenswrapper[4563]: I1124 09:27:51.310301 4563 scope.go:117] "RemoveContainer" containerID="6d7d5cb51b9fda5a5f8a5b4baaedf01d35e6707e1dcf24a352b6d310937270ea" Nov 24 09:27:51 crc kubenswrapper[4563]: I1124 09:27:51.332939 4563 scope.go:117] "RemoveContainer" containerID="d7c7a153898120b18ac46a4295647a7c3f657ef8b1786ddea6ed4b60e30dc468" Nov 24 09:27:56 crc kubenswrapper[4563]: I1124 09:27:56.025904 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-9dd3-account-create-wwmtc"] Nov 24 09:27:56 crc 
kubenswrapper[4563]: I1124 09:27:56.033394 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-j4vjc"] Nov 24 09:27:56 crc kubenswrapper[4563]: I1124 09:27:56.040100 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-j4vjc"] Nov 24 09:27:56 crc kubenswrapper[4563]: I1124 09:27:56.045972 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-9dd3-account-create-wwmtc"] Nov 24 09:27:57 crc kubenswrapper[4563]: I1124 09:27:57.063751 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="411e07f4-60ab-4c24-835e-8c677e121702" path="/var/lib/kubelet/pods/411e07f4-60ab-4c24-835e-8c677e121702/volumes" Nov 24 09:27:57 crc kubenswrapper[4563]: I1124 09:27:57.064508 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4df8208-ad24-49df-bd0b-dfb181a9269e" path="/var/lib/kubelet/pods/f4df8208-ad24-49df-bd0b-dfb181a9269e/volumes" Nov 24 09:28:08 crc kubenswrapper[4563]: I1124 09:28:08.987537 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:28:08 crc kubenswrapper[4563]: I1124 09:28:08.988053 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:28:08 crc kubenswrapper[4563]: I1124 09:28:08.988111 4563 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" Nov 24 09:28:08 crc kubenswrapper[4563]: I1124 09:28:08.989047 4563 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4d8f48825147068e682924024fb98e71a696f2055c921253ed4d8afbad01ed41"} pod="openshift-machine-config-operator/machine-config-daemon-stlxr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 09:28:08 crc kubenswrapper[4563]: I1124 09:28:08.989113 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" containerID="cri-o://4d8f48825147068e682924024fb98e71a696f2055c921253ed4d8afbad01ed41" gracePeriod=600 Nov 24 09:28:10 crc kubenswrapper[4563]: I1124 09:28:10.089004 4563 generic.go:334] "Generic (PLEG): container finished" podID="3b2bfe55-8989-49b3-bb61-e28189447627" containerID="4d8f48825147068e682924024fb98e71a696f2055c921253ed4d8afbad01ed41" exitCode=0 Nov 24 09:28:10 crc kubenswrapper[4563]: I1124 09:28:10.089099 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" event={"ID":"3b2bfe55-8989-49b3-bb61-e28189447627","Type":"ContainerDied","Data":"4d8f48825147068e682924024fb98e71a696f2055c921253ed4d8afbad01ed41"} Nov 24 09:28:10 crc kubenswrapper[4563]: I1124 09:28:10.089562 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" event={"ID":"3b2bfe55-8989-49b3-bb61-e28189447627","Type":"ContainerStarted","Data":"bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512"} Nov 24 09:28:10 crc kubenswrapper[4563]: I1124 09:28:10.089585 4563 scope.go:117] "RemoveContainer" containerID="910d483568e80e7c5051043679fd4f1476c6c059619a02146f6b5a112281d18f" Nov 24 09:28:17 crc kubenswrapper[4563]: I1124 09:28:17.031274 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-c035-account-create-q6pbl"] Nov 24 09:28:17 crc kubenswrapper[4563]: I1124 09:28:17.037827 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-k9nfs"] Nov 24 09:28:17 crc kubenswrapper[4563]: I1124 09:28:17.044537 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-c035-account-create-q6pbl"] Nov 24 09:28:17 crc kubenswrapper[4563]: I1124 09:28:17.049931 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-k9nfs"] Nov 24 09:28:17 crc kubenswrapper[4563]: I1124 09:28:17.064690 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550" path="/var/lib/kubelet/pods/4e633cdc-e6d0-4bb6-bfe2-c6c1f1fef550/volumes" Nov 24 09:28:17 crc kubenswrapper[4563]: I1124 09:28:17.065522 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0e93314-6c63-4319-bc2d-7ef3c6b917ec" path="/var/lib/kubelet/pods/a0e93314-6c63-4319-bc2d-7ef3c6b917ec/volumes" Nov 24 09:28:18 crc kubenswrapper[4563]: I1124 09:28:18.024066 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-4b13-account-create-4626m"] Nov 24 09:28:18 crc kubenswrapper[4563]: I1124 09:28:18.031348 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-l585m"] Nov 24 09:28:18 crc kubenswrapper[4563]: I1124 09:28:18.038660 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-d52e-account-create-p7ns9"] Nov 24 09:28:18 crc kubenswrapper[4563]: I1124 09:28:18.043706 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-t4vt4"] Nov 24 09:28:18 crc kubenswrapper[4563]: I1124 09:28:18.048923 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-4b13-account-create-4626m"] Nov 24 09:28:18 crc kubenswrapper[4563]: I1124 09:28:18.054262 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/barbican-db-create-l585m"] Nov 24 09:28:18 crc kubenswrapper[4563]: I1124 09:28:18.058947 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-d52e-account-create-p7ns9"] Nov 24 09:28:18 crc kubenswrapper[4563]: I1124 09:28:18.064013 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-t4vt4"] Nov 24 09:28:19 crc kubenswrapper[4563]: I1124 09:28:19.063530 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26761262-f331-4bde-8b02-ff48aa5f3875" path="/var/lib/kubelet/pods/26761262-f331-4bde-8b02-ff48aa5f3875/volumes" Nov 24 09:28:19 crc kubenswrapper[4563]: I1124 09:28:19.064278 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fd85b6a-57be-47d8-955d-187926600e97" path="/var/lib/kubelet/pods/4fd85b6a-57be-47d8-955d-187926600e97/volumes" Nov 24 09:28:19 crc kubenswrapper[4563]: I1124 09:28:19.064769 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76c84c12-1726-494a-91ab-598ce15287ae" path="/var/lib/kubelet/pods/76c84c12-1726-494a-91ab-598ce15287ae/volumes" Nov 24 09:28:19 crc kubenswrapper[4563]: I1124 09:28:19.065255 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db3a0f98-33b9-450d-91e4-6575a60cfb2e" path="/var/lib/kubelet/pods/db3a0f98-33b9-450d-91e4-6575a60cfb2e/volumes" Nov 24 09:28:20 crc kubenswrapper[4563]: I1124 09:28:20.023240 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-6msdh"] Nov 24 09:28:20 crc kubenswrapper[4563]: I1124 09:28:20.029386 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-6msdh"] Nov 24 09:28:21 crc kubenswrapper[4563]: I1124 09:28:21.064225 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20286861-2552-4ff4-a5a1-3d67c9e0cb7b" path="/var/lib/kubelet/pods/20286861-2552-4ff4-a5a1-3d67c9e0cb7b/volumes" Nov 24 09:28:24 crc kubenswrapper[4563]: I1124 09:28:24.023411 
4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-4w2gs"] Nov 24 09:28:24 crc kubenswrapper[4563]: I1124 09:28:24.030005 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-4w2gs"] Nov 24 09:28:25 crc kubenswrapper[4563]: I1124 09:28:25.064775 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1190dd8-6aef-4116-be7f-e498cfe0db11" path="/var/lib/kubelet/pods/d1190dd8-6aef-4116-be7f-e498cfe0db11/volumes" Nov 24 09:28:34 crc kubenswrapper[4563]: I1124 09:28:34.305220 4563 generic.go:334] "Generic (PLEG): container finished" podID="ea6d045d-1394-436f-9329-9f3a9d10610b" containerID="b6ea7d2c74e1ba15dd88e95db97d4ecb79866bfaa06c88bbfd2ac613fc460771" exitCode=0 Nov 24 09:28:34 crc kubenswrapper[4563]: I1124 09:28:34.305288 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj" event={"ID":"ea6d045d-1394-436f-9329-9f3a9d10610b","Type":"ContainerDied","Data":"b6ea7d2c74e1ba15dd88e95db97d4ecb79866bfaa06c88bbfd2ac613fc460771"} Nov 24 09:28:35 crc kubenswrapper[4563]: I1124 09:28:35.639088 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj" Nov 24 09:28:35 crc kubenswrapper[4563]: I1124 09:28:35.810905 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea6d045d-1394-436f-9329-9f3a9d10610b-inventory\") pod \"ea6d045d-1394-436f-9329-9f3a9d10610b\" (UID: \"ea6d045d-1394-436f-9329-9f3a9d10610b\") " Nov 24 09:28:35 crc kubenswrapper[4563]: I1124 09:28:35.811105 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea6d045d-1394-436f-9329-9f3a9d10610b-ssh-key\") pod \"ea6d045d-1394-436f-9329-9f3a9d10610b\" (UID: \"ea6d045d-1394-436f-9329-9f3a9d10610b\") " Nov 24 09:28:35 crc kubenswrapper[4563]: I1124 09:28:35.811128 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnhnm\" (UniqueName: \"kubernetes.io/projected/ea6d045d-1394-436f-9329-9f3a9d10610b-kube-api-access-bnhnm\") pod \"ea6d045d-1394-436f-9329-9f3a9d10610b\" (UID: \"ea6d045d-1394-436f-9329-9f3a9d10610b\") " Nov 24 09:28:35 crc kubenswrapper[4563]: I1124 09:28:35.815724 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea6d045d-1394-436f-9329-9f3a9d10610b-kube-api-access-bnhnm" (OuterVolumeSpecName: "kube-api-access-bnhnm") pod "ea6d045d-1394-436f-9329-9f3a9d10610b" (UID: "ea6d045d-1394-436f-9329-9f3a9d10610b"). InnerVolumeSpecName "kube-api-access-bnhnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:28:35 crc kubenswrapper[4563]: I1124 09:28:35.831180 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea6d045d-1394-436f-9329-9f3a9d10610b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ea6d045d-1394-436f-9329-9f3a9d10610b" (UID: "ea6d045d-1394-436f-9329-9f3a9d10610b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:28:35 crc kubenswrapper[4563]: I1124 09:28:35.831438 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea6d045d-1394-436f-9329-9f3a9d10610b-inventory" (OuterVolumeSpecName: "inventory") pod "ea6d045d-1394-436f-9329-9f3a9d10610b" (UID: "ea6d045d-1394-436f-9329-9f3a9d10610b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:28:35 crc kubenswrapper[4563]: I1124 09:28:35.914205 4563 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea6d045d-1394-436f-9329-9f3a9d10610b-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 09:28:35 crc kubenswrapper[4563]: I1124 09:28:35.914231 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnhnm\" (UniqueName: \"kubernetes.io/projected/ea6d045d-1394-436f-9329-9f3a9d10610b-kube-api-access-bnhnm\") on node \"crc\" DevicePath \"\"" Nov 24 09:28:35 crc kubenswrapper[4563]: I1124 09:28:35.914243 4563 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea6d045d-1394-436f-9329-9f3a9d10610b-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 09:28:36 crc kubenswrapper[4563]: I1124 09:28:36.323407 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj" event={"ID":"ea6d045d-1394-436f-9329-9f3a9d10610b","Type":"ContainerDied","Data":"4c112e1422d3c6ff33ec809b48a289d8465ba13f14aab75e4cd7e42345ace642"} Nov 24 09:28:36 crc kubenswrapper[4563]: I1124 09:28:36.323449 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c112e1422d3c6ff33ec809b48a289d8465ba13f14aab75e4cd7e42345ace642" Nov 24 09:28:36 crc kubenswrapper[4563]: I1124 09:28:36.323449 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj" Nov 24 09:28:36 crc kubenswrapper[4563]: I1124 09:28:36.383372 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm"] Nov 24 09:28:36 crc kubenswrapper[4563]: E1124 09:28:36.384039 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea6d045d-1394-436f-9329-9f3a9d10610b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 24 09:28:36 crc kubenswrapper[4563]: I1124 09:28:36.384070 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea6d045d-1394-436f-9329-9f3a9d10610b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 24 09:28:36 crc kubenswrapper[4563]: I1124 09:28:36.384356 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea6d045d-1394-436f-9329-9f3a9d10610b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 24 09:28:36 crc kubenswrapper[4563]: I1124 09:28:36.385064 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm" Nov 24 09:28:36 crc kubenswrapper[4563]: I1124 09:28:36.392448 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 24 09:28:36 crc kubenswrapper[4563]: I1124 09:28:36.392542 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 09:28:36 crc kubenswrapper[4563]: I1124 09:28:36.392822 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm"] Nov 24 09:28:36 crc kubenswrapper[4563]: I1124 09:28:36.392919 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jd9jk" Nov 24 09:28:36 crc kubenswrapper[4563]: I1124 09:28:36.395103 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 24 09:28:36 crc kubenswrapper[4563]: I1124 09:28:36.524835 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb4jn\" (UniqueName: \"kubernetes.io/projected/72560768-c189-4eaa-9128-486ec369275b-kube-api-access-wb4jn\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm\" (UID: \"72560768-c189-4eaa-9128-486ec369275b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm" Nov 24 09:28:36 crc kubenswrapper[4563]: I1124 09:28:36.524974 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72560768-c189-4eaa-9128-486ec369275b-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm\" (UID: \"72560768-c189-4eaa-9128-486ec369275b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm" Nov 24 09:28:36 crc kubenswrapper[4563]: I1124 
09:28:36.525242 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/72560768-c189-4eaa-9128-486ec369275b-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm\" (UID: \"72560768-c189-4eaa-9128-486ec369275b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm" Nov 24 09:28:36 crc kubenswrapper[4563]: I1124 09:28:36.626795 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/72560768-c189-4eaa-9128-486ec369275b-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm\" (UID: \"72560768-c189-4eaa-9128-486ec369275b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm" Nov 24 09:28:36 crc kubenswrapper[4563]: I1124 09:28:36.626882 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb4jn\" (UniqueName: \"kubernetes.io/projected/72560768-c189-4eaa-9128-486ec369275b-kube-api-access-wb4jn\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm\" (UID: \"72560768-c189-4eaa-9128-486ec369275b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm" Nov 24 09:28:36 crc kubenswrapper[4563]: I1124 09:28:36.627471 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72560768-c189-4eaa-9128-486ec369275b-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm\" (UID: \"72560768-c189-4eaa-9128-486ec369275b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm" Nov 24 09:28:36 crc kubenswrapper[4563]: I1124 09:28:36.630732 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/72560768-c189-4eaa-9128-486ec369275b-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm\" (UID: \"72560768-c189-4eaa-9128-486ec369275b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm" Nov 24 09:28:36 crc kubenswrapper[4563]: I1124 09:28:36.631485 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72560768-c189-4eaa-9128-486ec369275b-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm\" (UID: \"72560768-c189-4eaa-9128-486ec369275b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm" Nov 24 09:28:36 crc kubenswrapper[4563]: I1124 09:28:36.641876 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb4jn\" (UniqueName: \"kubernetes.io/projected/72560768-c189-4eaa-9128-486ec369275b-kube-api-access-wb4jn\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm\" (UID: \"72560768-c189-4eaa-9128-486ec369275b\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm" Nov 24 09:28:36 crc kubenswrapper[4563]: I1124 09:28:36.698621 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm" Nov 24 09:28:37 crc kubenswrapper[4563]: I1124 09:28:37.131656 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm"] Nov 24 09:28:37 crc kubenswrapper[4563]: I1124 09:28:37.136222 4563 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 09:28:37 crc kubenswrapper[4563]: I1124 09:28:37.333233 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm" event={"ID":"72560768-c189-4eaa-9128-486ec369275b","Type":"ContainerStarted","Data":"b0feca48f2fe4eab90775734ee99d9cb48cfdcf8b63bc929d2866bbde0b9688c"} Nov 24 09:28:38 crc kubenswrapper[4563]: I1124 09:28:38.358178 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm" event={"ID":"72560768-c189-4eaa-9128-486ec369275b","Type":"ContainerStarted","Data":"4e13f0685b6ccdf29cb67bce07fad0110292d8dd9844bbc3b819448c52586040"} Nov 24 09:28:38 crc kubenswrapper[4563]: I1124 09:28:38.372768 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm" podStartSLOduration=1.777320647 podStartE2EDuration="2.372754558s" podCreationTimestamp="2025-11-24 09:28:36 +0000 UTC" firstStartedPulling="2025-11-24 09:28:37.135989029 +0000 UTC m=+1494.394966475" lastFinishedPulling="2025-11-24 09:28:37.73142294 +0000 UTC m=+1494.990400386" observedRunningTime="2025-11-24 09:28:38.369677525 +0000 UTC m=+1495.628654972" watchObservedRunningTime="2025-11-24 09:28:38.372754558 +0000 UTC m=+1495.631732005" Nov 24 09:28:41 crc kubenswrapper[4563]: I1124 09:28:41.381145 4563 generic.go:334] "Generic (PLEG): container finished" podID="72560768-c189-4eaa-9128-486ec369275b" 
containerID="4e13f0685b6ccdf29cb67bce07fad0110292d8dd9844bbc3b819448c52586040" exitCode=0 Nov 24 09:28:41 crc kubenswrapper[4563]: I1124 09:28:41.381215 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm" event={"ID":"72560768-c189-4eaa-9128-486ec369275b","Type":"ContainerDied","Data":"4e13f0685b6ccdf29cb67bce07fad0110292d8dd9844bbc3b819448c52586040"} Nov 24 09:28:42 crc kubenswrapper[4563]: I1124 09:28:42.707357 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm" Nov 24 09:28:42 crc kubenswrapper[4563]: I1124 09:28:42.820968 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/72560768-c189-4eaa-9128-486ec369275b-ssh-key\") pod \"72560768-c189-4eaa-9128-486ec369275b\" (UID: \"72560768-c189-4eaa-9128-486ec369275b\") " Nov 24 09:28:42 crc kubenswrapper[4563]: I1124 09:28:42.821182 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72560768-c189-4eaa-9128-486ec369275b-inventory\") pod \"72560768-c189-4eaa-9128-486ec369275b\" (UID: \"72560768-c189-4eaa-9128-486ec369275b\") " Nov 24 09:28:42 crc kubenswrapper[4563]: I1124 09:28:42.821219 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb4jn\" (UniqueName: \"kubernetes.io/projected/72560768-c189-4eaa-9128-486ec369275b-kube-api-access-wb4jn\") pod \"72560768-c189-4eaa-9128-486ec369275b\" (UID: \"72560768-c189-4eaa-9128-486ec369275b\") " Nov 24 09:28:42 crc kubenswrapper[4563]: I1124 09:28:42.825732 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72560768-c189-4eaa-9128-486ec369275b-kube-api-access-wb4jn" (OuterVolumeSpecName: "kube-api-access-wb4jn") pod 
"72560768-c189-4eaa-9128-486ec369275b" (UID: "72560768-c189-4eaa-9128-486ec369275b"). InnerVolumeSpecName "kube-api-access-wb4jn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:28:42 crc kubenswrapper[4563]: I1124 09:28:42.842757 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72560768-c189-4eaa-9128-486ec369275b-inventory" (OuterVolumeSpecName: "inventory") pod "72560768-c189-4eaa-9128-486ec369275b" (UID: "72560768-c189-4eaa-9128-486ec369275b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:28:42 crc kubenswrapper[4563]: I1124 09:28:42.843627 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72560768-c189-4eaa-9128-486ec369275b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "72560768-c189-4eaa-9128-486ec369275b" (UID: "72560768-c189-4eaa-9128-486ec369275b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:28:42 crc kubenswrapper[4563]: I1124 09:28:42.923619 4563 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/72560768-c189-4eaa-9128-486ec369275b-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 09:28:42 crc kubenswrapper[4563]: I1124 09:28:42.923664 4563 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72560768-c189-4eaa-9128-486ec369275b-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 09:28:42 crc kubenswrapper[4563]: I1124 09:28:42.923676 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wb4jn\" (UniqueName: \"kubernetes.io/projected/72560768-c189-4eaa-9128-486ec369275b-kube-api-access-wb4jn\") on node \"crc\" DevicePath \"\"" Nov 24 09:28:43 crc kubenswrapper[4563]: I1124 09:28:43.395987 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm" event={"ID":"72560768-c189-4eaa-9128-486ec369275b","Type":"ContainerDied","Data":"b0feca48f2fe4eab90775734ee99d9cb48cfdcf8b63bc929d2866bbde0b9688c"} Nov 24 09:28:43 crc kubenswrapper[4563]: I1124 09:28:43.396236 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0feca48f2fe4eab90775734ee99d9cb48cfdcf8b63bc929d2866bbde0b9688c" Nov 24 09:28:43 crc kubenswrapper[4563]: I1124 09:28:43.396271 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm" Nov 24 09:28:43 crc kubenswrapper[4563]: I1124 09:28:43.442072 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4bvkr"] Nov 24 09:28:43 crc kubenswrapper[4563]: E1124 09:28:43.442397 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72560768-c189-4eaa-9128-486ec369275b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 24 09:28:43 crc kubenswrapper[4563]: I1124 09:28:43.442415 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="72560768-c189-4eaa-9128-486ec369275b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 24 09:28:43 crc kubenswrapper[4563]: I1124 09:28:43.442610 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="72560768-c189-4eaa-9128-486ec369275b" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 24 09:28:43 crc kubenswrapper[4563]: I1124 09:28:43.443178 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4bvkr" Nov 24 09:28:43 crc kubenswrapper[4563]: I1124 09:28:43.444851 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 24 09:28:43 crc kubenswrapper[4563]: I1124 09:28:43.444971 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 09:28:43 crc kubenswrapper[4563]: I1124 09:28:43.445171 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jd9jk" Nov 24 09:28:43 crc kubenswrapper[4563]: I1124 09:28:43.451896 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 24 09:28:43 crc kubenswrapper[4563]: I1124 09:28:43.452462 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4bvkr"] Nov 24 09:28:43 crc kubenswrapper[4563]: I1124 09:28:43.636441 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjrgf\" (UniqueName: \"kubernetes.io/projected/424c4e83-a3c6-4eea-958e-e0cf83f20fdf-kube-api-access-vjrgf\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4bvkr\" (UID: \"424c4e83-a3c6-4eea-958e-e0cf83f20fdf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4bvkr" Nov 24 09:28:43 crc kubenswrapper[4563]: I1124 09:28:43.636704 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/424c4e83-a3c6-4eea-958e-e0cf83f20fdf-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4bvkr\" (UID: \"424c4e83-a3c6-4eea-958e-e0cf83f20fdf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4bvkr" Nov 24 09:28:43 crc kubenswrapper[4563]: I1124 09:28:43.636827 4563 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/424c4e83-a3c6-4eea-958e-e0cf83f20fdf-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4bvkr\" (UID: \"424c4e83-a3c6-4eea-958e-e0cf83f20fdf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4bvkr" Nov 24 09:28:43 crc kubenswrapper[4563]: I1124 09:28:43.738122 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/424c4e83-a3c6-4eea-958e-e0cf83f20fdf-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4bvkr\" (UID: \"424c4e83-a3c6-4eea-958e-e0cf83f20fdf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4bvkr" Nov 24 09:28:43 crc kubenswrapper[4563]: I1124 09:28:43.738225 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/424c4e83-a3c6-4eea-958e-e0cf83f20fdf-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4bvkr\" (UID: \"424c4e83-a3c6-4eea-958e-e0cf83f20fdf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4bvkr" Nov 24 09:28:43 crc kubenswrapper[4563]: I1124 09:28:43.738329 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjrgf\" (UniqueName: \"kubernetes.io/projected/424c4e83-a3c6-4eea-958e-e0cf83f20fdf-kube-api-access-vjrgf\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4bvkr\" (UID: \"424c4e83-a3c6-4eea-958e-e0cf83f20fdf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4bvkr" Nov 24 09:28:43 crc kubenswrapper[4563]: I1124 09:28:43.742282 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/424c4e83-a3c6-4eea-958e-e0cf83f20fdf-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4bvkr\" (UID: 
\"424c4e83-a3c6-4eea-958e-e0cf83f20fdf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4bvkr" Nov 24 09:28:43 crc kubenswrapper[4563]: I1124 09:28:43.742483 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/424c4e83-a3c6-4eea-958e-e0cf83f20fdf-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4bvkr\" (UID: \"424c4e83-a3c6-4eea-958e-e0cf83f20fdf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4bvkr" Nov 24 09:28:43 crc kubenswrapper[4563]: I1124 09:28:43.751176 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjrgf\" (UniqueName: \"kubernetes.io/projected/424c4e83-a3c6-4eea-958e-e0cf83f20fdf-kube-api-access-vjrgf\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4bvkr\" (UID: \"424c4e83-a3c6-4eea-958e-e0cf83f20fdf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4bvkr" Nov 24 09:28:43 crc kubenswrapper[4563]: I1124 09:28:43.757287 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4bvkr" Nov 24 09:28:44 crc kubenswrapper[4563]: I1124 09:28:44.178126 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4bvkr"] Nov 24 09:28:44 crc kubenswrapper[4563]: I1124 09:28:44.406815 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4bvkr" event={"ID":"424c4e83-a3c6-4eea-958e-e0cf83f20fdf","Type":"ContainerStarted","Data":"865dbb4d745f5f7b27d25b0446ea1402069d3d72a145cc220cd917551d9953ce"} Nov 24 09:28:45 crc kubenswrapper[4563]: I1124 09:28:45.416500 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4bvkr" event={"ID":"424c4e83-a3c6-4eea-958e-e0cf83f20fdf","Type":"ContainerStarted","Data":"2f1f4fd6127c27b7e203626472086cc4bfff1080224d42ff636a457a7451a8f7"} Nov 24 09:28:45 crc kubenswrapper[4563]: I1124 09:28:45.433326 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4bvkr" podStartSLOduration=1.952755748 podStartE2EDuration="2.433311413s" podCreationTimestamp="2025-11-24 09:28:43 +0000 UTC" firstStartedPulling="2025-11-24 09:28:44.182143533 +0000 UTC m=+1501.441120981" lastFinishedPulling="2025-11-24 09:28:44.662699198 +0000 UTC m=+1501.921676646" observedRunningTime="2025-11-24 09:28:45.426833006 +0000 UTC m=+1502.685810452" watchObservedRunningTime="2025-11-24 09:28:45.433311413 +0000 UTC m=+1502.692288860" Nov 24 09:28:48 crc kubenswrapper[4563]: I1124 09:28:48.034853 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-ngwrz"] Nov 24 09:28:48 crc kubenswrapper[4563]: I1124 09:28:48.043393 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-ngwrz"] Nov 24 09:28:48 crc kubenswrapper[4563]: I1124 09:28:48.190471 4563 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zm56j"] Nov 24 09:28:48 crc kubenswrapper[4563]: I1124 09:28:48.192387 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zm56j" Nov 24 09:28:48 crc kubenswrapper[4563]: I1124 09:28:48.202451 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zm56j"] Nov 24 09:28:48 crc kubenswrapper[4563]: I1124 09:28:48.323112 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56d2060b-db38-4a2d-8a04-362d9ba093c3-utilities\") pod \"redhat-operators-zm56j\" (UID: \"56d2060b-db38-4a2d-8a04-362d9ba093c3\") " pod="openshift-marketplace/redhat-operators-zm56j" Nov 24 09:28:48 crc kubenswrapper[4563]: I1124 09:28:48.323239 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tnqh\" (UniqueName: \"kubernetes.io/projected/56d2060b-db38-4a2d-8a04-362d9ba093c3-kube-api-access-4tnqh\") pod \"redhat-operators-zm56j\" (UID: \"56d2060b-db38-4a2d-8a04-362d9ba093c3\") " pod="openshift-marketplace/redhat-operators-zm56j" Nov 24 09:28:48 crc kubenswrapper[4563]: I1124 09:28:48.323596 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56d2060b-db38-4a2d-8a04-362d9ba093c3-catalog-content\") pod \"redhat-operators-zm56j\" (UID: \"56d2060b-db38-4a2d-8a04-362d9ba093c3\") " pod="openshift-marketplace/redhat-operators-zm56j" Nov 24 09:28:48 crc kubenswrapper[4563]: I1124 09:28:48.425394 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56d2060b-db38-4a2d-8a04-362d9ba093c3-catalog-content\") pod \"redhat-operators-zm56j\" (UID: 
\"56d2060b-db38-4a2d-8a04-362d9ba093c3\") " pod="openshift-marketplace/redhat-operators-zm56j" Nov 24 09:28:48 crc kubenswrapper[4563]: I1124 09:28:48.425807 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56d2060b-db38-4a2d-8a04-362d9ba093c3-utilities\") pod \"redhat-operators-zm56j\" (UID: \"56d2060b-db38-4a2d-8a04-362d9ba093c3\") " pod="openshift-marketplace/redhat-operators-zm56j" Nov 24 09:28:48 crc kubenswrapper[4563]: I1124 09:28:48.425858 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tnqh\" (UniqueName: \"kubernetes.io/projected/56d2060b-db38-4a2d-8a04-362d9ba093c3-kube-api-access-4tnqh\") pod \"redhat-operators-zm56j\" (UID: \"56d2060b-db38-4a2d-8a04-362d9ba093c3\") " pod="openshift-marketplace/redhat-operators-zm56j" Nov 24 09:28:48 crc kubenswrapper[4563]: I1124 09:28:48.425862 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56d2060b-db38-4a2d-8a04-362d9ba093c3-catalog-content\") pod \"redhat-operators-zm56j\" (UID: \"56d2060b-db38-4a2d-8a04-362d9ba093c3\") " pod="openshift-marketplace/redhat-operators-zm56j" Nov 24 09:28:48 crc kubenswrapper[4563]: I1124 09:28:48.426184 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56d2060b-db38-4a2d-8a04-362d9ba093c3-utilities\") pod \"redhat-operators-zm56j\" (UID: \"56d2060b-db38-4a2d-8a04-362d9ba093c3\") " pod="openshift-marketplace/redhat-operators-zm56j" Nov 24 09:28:48 crc kubenswrapper[4563]: I1124 09:28:48.458038 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tnqh\" (UniqueName: \"kubernetes.io/projected/56d2060b-db38-4a2d-8a04-362d9ba093c3-kube-api-access-4tnqh\") pod \"redhat-operators-zm56j\" (UID: \"56d2060b-db38-4a2d-8a04-362d9ba093c3\") " 
pod="openshift-marketplace/redhat-operators-zm56j" Nov 24 09:28:48 crc kubenswrapper[4563]: I1124 09:28:48.509002 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zm56j" Nov 24 09:28:48 crc kubenswrapper[4563]: I1124 09:28:48.907138 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zm56j"] Nov 24 09:28:49 crc kubenswrapper[4563]: I1124 09:28:49.064516 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d128ad0e-d5fb-4f46-a737-b68fb15b7dc6" path="/var/lib/kubelet/pods/d128ad0e-d5fb-4f46-a737-b68fb15b7dc6/volumes" Nov 24 09:28:49 crc kubenswrapper[4563]: I1124 09:28:49.451160 4563 generic.go:334] "Generic (PLEG): container finished" podID="56d2060b-db38-4a2d-8a04-362d9ba093c3" containerID="267692549abc9379f2c9e4a5618e4727852839471e53bde19aeae1e70a4fb8ff" exitCode=0 Nov 24 09:28:49 crc kubenswrapper[4563]: I1124 09:28:49.451200 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zm56j" event={"ID":"56d2060b-db38-4a2d-8a04-362d9ba093c3","Type":"ContainerDied","Data":"267692549abc9379f2c9e4a5618e4727852839471e53bde19aeae1e70a4fb8ff"} Nov 24 09:28:49 crc kubenswrapper[4563]: I1124 09:28:49.451243 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zm56j" event={"ID":"56d2060b-db38-4a2d-8a04-362d9ba093c3","Type":"ContainerStarted","Data":"cff422e7a45378305e7c780499a4a4ca94379fa21a11362243dc76be4cc4a395"} Nov 24 09:28:50 crc kubenswrapper[4563]: I1124 09:28:50.460899 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zm56j" event={"ID":"56d2060b-db38-4a2d-8a04-362d9ba093c3","Type":"ContainerStarted","Data":"5b4a28a8e78701e1f7847ddb164c8b2e64ab61f10fd62be59dcabbdd4482d881"} Nov 24 09:28:51 crc kubenswrapper[4563]: I1124 09:28:51.428006 4563 scope.go:117] "RemoveContainer" 
containerID="c8e054827b3388db0230a41a37b488fbd7caa049e0c9f4174587f47878f6e3a4" Nov 24 09:28:51 crc kubenswrapper[4563]: I1124 09:28:51.447372 4563 scope.go:117] "RemoveContainer" containerID="38580ef16e0473e5a0968c04ed2852b6e8f41ec8ad01508bb9978bc7b660f66d" Nov 24 09:28:51 crc kubenswrapper[4563]: I1124 09:28:51.483113 4563 scope.go:117] "RemoveContainer" containerID="1eca9adbe2a535a1d64ad8ac8ca6fbde84a77d6dcc8242798f4621e2b7608740" Nov 24 09:28:51 crc kubenswrapper[4563]: I1124 09:28:51.515926 4563 scope.go:117] "RemoveContainer" containerID="cd9a6e0bb9cf33eeb070fb668ea6ead3754b52490ba66ba3b99f0717f840c0b9" Nov 24 09:28:51 crc kubenswrapper[4563]: I1124 09:28:51.563436 4563 scope.go:117] "RemoveContainer" containerID="d3c46f2138d7e9e9519753124f1b46db0e7ec960bdf134ed0955e1c1a103c617" Nov 24 09:28:51 crc kubenswrapper[4563]: I1124 09:28:51.598996 4563 scope.go:117] "RemoveContainer" containerID="289ebe41425706f766743fcb4109dbe0797f68d0730b502ebdd960199f36d52b" Nov 24 09:28:51 crc kubenswrapper[4563]: I1124 09:28:51.622557 4563 scope.go:117] "RemoveContainer" containerID="0bb3afca1b8eb24e5748089f4bde24844a4d21f3f69bc044ff2f09fa889e9655" Nov 24 09:28:51 crc kubenswrapper[4563]: I1124 09:28:51.643199 4563 scope.go:117] "RemoveContainer" containerID="cf36b861866fe18ca83c30e811f07dfc6934b6b4d377c37c561572c96f7bfff6" Nov 24 09:28:51 crc kubenswrapper[4563]: I1124 09:28:51.674718 4563 scope.go:117] "RemoveContainer" containerID="d5da6cab5264481835c5f9fa0d133666f92336aa7a6601d942d575db8d3bd9d4" Nov 24 09:28:51 crc kubenswrapper[4563]: I1124 09:28:51.694099 4563 scope.go:117] "RemoveContainer" containerID="dc931fdc1a3b3855e7604863182b2f20b24224eeace1c806902e5b4024b07382" Nov 24 09:28:51 crc kubenswrapper[4563]: I1124 09:28:51.711883 4563 scope.go:117] "RemoveContainer" containerID="c1f9e4633c7081ec9d9e372f4eac02386363375477b604311b4e4cdde2fdaba4" Nov 24 09:28:53 crc kubenswrapper[4563]: I1124 09:28:53.491389 4563 generic.go:334] "Generic (PLEG): container finished" 
podID="56d2060b-db38-4a2d-8a04-362d9ba093c3" containerID="5b4a28a8e78701e1f7847ddb164c8b2e64ab61f10fd62be59dcabbdd4482d881" exitCode=0 Nov 24 09:28:53 crc kubenswrapper[4563]: I1124 09:28:53.491572 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zm56j" event={"ID":"56d2060b-db38-4a2d-8a04-362d9ba093c3","Type":"ContainerDied","Data":"5b4a28a8e78701e1f7847ddb164c8b2e64ab61f10fd62be59dcabbdd4482d881"} Nov 24 09:28:54 crc kubenswrapper[4563]: I1124 09:28:54.503350 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zm56j" event={"ID":"56d2060b-db38-4a2d-8a04-362d9ba093c3","Type":"ContainerStarted","Data":"ce3822864260fcd7eb21104ed5f718eec4777823726a124a66ae152de963268f"} Nov 24 09:28:54 crc kubenswrapper[4563]: I1124 09:28:54.523614 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zm56j" podStartSLOduration=1.997368733 podStartE2EDuration="6.523599468s" podCreationTimestamp="2025-11-24 09:28:48 +0000 UTC" firstStartedPulling="2025-11-24 09:28:49.452671425 +0000 UTC m=+1506.711648872" lastFinishedPulling="2025-11-24 09:28:53.97890216 +0000 UTC m=+1511.237879607" observedRunningTime="2025-11-24 09:28:54.516500221 +0000 UTC m=+1511.775477667" watchObservedRunningTime="2025-11-24 09:28:54.523599468 +0000 UTC m=+1511.782576915" Nov 24 09:28:58 crc kubenswrapper[4563]: I1124 09:28:58.027455 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-jh96b"] Nov 24 09:28:58 crc kubenswrapper[4563]: I1124 09:28:58.035546 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-jh96b"] Nov 24 09:28:58 crc kubenswrapper[4563]: I1124 09:28:58.510145 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zm56j" Nov 24 09:28:58 crc kubenswrapper[4563]: I1124 09:28:58.510203 4563 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zm56j" Nov 24 09:28:59 crc kubenswrapper[4563]: I1124 09:28:59.063682 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="539b9102-6a58-4804-8b35-4b183ef45c82" path="/var/lib/kubelet/pods/539b9102-6a58-4804-8b35-4b183ef45c82/volumes" Nov 24 09:28:59 crc kubenswrapper[4563]: I1124 09:28:59.545947 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zm56j" podUID="56d2060b-db38-4a2d-8a04-362d9ba093c3" containerName="registry-server" probeResult="failure" output=< Nov 24 09:28:59 crc kubenswrapper[4563]: timeout: failed to connect service ":50051" within 1s Nov 24 09:28:59 crc kubenswrapper[4563]: > Nov 24 09:29:00 crc kubenswrapper[4563]: I1124 09:29:00.189671 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kj2m6"] Nov 24 09:29:00 crc kubenswrapper[4563]: I1124 09:29:00.191997 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kj2m6" Nov 24 09:29:00 crc kubenswrapper[4563]: I1124 09:29:00.215040 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kj2m6"] Nov 24 09:29:00 crc kubenswrapper[4563]: I1124 09:29:00.250244 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee1e81c9-585e-4c08-b893-a6f74aa84138-catalog-content\") pod \"redhat-marketplace-kj2m6\" (UID: \"ee1e81c9-585e-4c08-b893-a6f74aa84138\") " pod="openshift-marketplace/redhat-marketplace-kj2m6" Nov 24 09:29:00 crc kubenswrapper[4563]: I1124 09:29:00.250383 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee1e81c9-585e-4c08-b893-a6f74aa84138-utilities\") pod \"redhat-marketplace-kj2m6\" (UID: \"ee1e81c9-585e-4c08-b893-a6f74aa84138\") " pod="openshift-marketplace/redhat-marketplace-kj2m6" Nov 24 09:29:00 crc kubenswrapper[4563]: I1124 09:29:00.250403 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nrsj\" (UniqueName: \"kubernetes.io/projected/ee1e81c9-585e-4c08-b893-a6f74aa84138-kube-api-access-5nrsj\") pod \"redhat-marketplace-kj2m6\" (UID: \"ee1e81c9-585e-4c08-b893-a6f74aa84138\") " pod="openshift-marketplace/redhat-marketplace-kj2m6" Nov 24 09:29:00 crc kubenswrapper[4563]: I1124 09:29:00.352690 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee1e81c9-585e-4c08-b893-a6f74aa84138-catalog-content\") pod \"redhat-marketplace-kj2m6\" (UID: \"ee1e81c9-585e-4c08-b893-a6f74aa84138\") " pod="openshift-marketplace/redhat-marketplace-kj2m6" Nov 24 09:29:00 crc kubenswrapper[4563]: I1124 09:29:00.352875 4563 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee1e81c9-585e-4c08-b893-a6f74aa84138-utilities\") pod \"redhat-marketplace-kj2m6\" (UID: \"ee1e81c9-585e-4c08-b893-a6f74aa84138\") " pod="openshift-marketplace/redhat-marketplace-kj2m6" Nov 24 09:29:00 crc kubenswrapper[4563]: I1124 09:29:00.352898 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nrsj\" (UniqueName: \"kubernetes.io/projected/ee1e81c9-585e-4c08-b893-a6f74aa84138-kube-api-access-5nrsj\") pod \"redhat-marketplace-kj2m6\" (UID: \"ee1e81c9-585e-4c08-b893-a6f74aa84138\") " pod="openshift-marketplace/redhat-marketplace-kj2m6" Nov 24 09:29:00 crc kubenswrapper[4563]: I1124 09:29:00.353180 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee1e81c9-585e-4c08-b893-a6f74aa84138-catalog-content\") pod \"redhat-marketplace-kj2m6\" (UID: \"ee1e81c9-585e-4c08-b893-a6f74aa84138\") " pod="openshift-marketplace/redhat-marketplace-kj2m6" Nov 24 09:29:00 crc kubenswrapper[4563]: I1124 09:29:00.353265 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee1e81c9-585e-4c08-b893-a6f74aa84138-utilities\") pod \"redhat-marketplace-kj2m6\" (UID: \"ee1e81c9-585e-4c08-b893-a6f74aa84138\") " pod="openshift-marketplace/redhat-marketplace-kj2m6" Nov 24 09:29:00 crc kubenswrapper[4563]: I1124 09:29:00.371212 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nrsj\" (UniqueName: \"kubernetes.io/projected/ee1e81c9-585e-4c08-b893-a6f74aa84138-kube-api-access-5nrsj\") pod \"redhat-marketplace-kj2m6\" (UID: \"ee1e81c9-585e-4c08-b893-a6f74aa84138\") " pod="openshift-marketplace/redhat-marketplace-kj2m6" Nov 24 09:29:00 crc kubenswrapper[4563]: I1124 09:29:00.508131 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kj2m6" Nov 24 09:29:00 crc kubenswrapper[4563]: I1124 09:29:00.925430 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kj2m6"] Nov 24 09:29:01 crc kubenswrapper[4563]: I1124 09:29:01.557205 4563 generic.go:334] "Generic (PLEG): container finished" podID="ee1e81c9-585e-4c08-b893-a6f74aa84138" containerID="db6aae2070efa72533f807a2abf3f6a8bd8f926c7a394be7b47ceead730cdfd5" exitCode=0 Nov 24 09:29:01 crc kubenswrapper[4563]: I1124 09:29:01.557399 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kj2m6" event={"ID":"ee1e81c9-585e-4c08-b893-a6f74aa84138","Type":"ContainerDied","Data":"db6aae2070efa72533f807a2abf3f6a8bd8f926c7a394be7b47ceead730cdfd5"} Nov 24 09:29:01 crc kubenswrapper[4563]: I1124 09:29:01.557489 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kj2m6" event={"ID":"ee1e81c9-585e-4c08-b893-a6f74aa84138","Type":"ContainerStarted","Data":"67beceae60ec2ab05b28f4fb33bb790d73f730ab88436474db4ada5e7f4fc4b6"} Nov 24 09:29:02 crc kubenswrapper[4563]: I1124 09:29:02.566186 4563 generic.go:334] "Generic (PLEG): container finished" podID="ee1e81c9-585e-4c08-b893-a6f74aa84138" containerID="7a7089f9c51e1f92aeaef539b16ab537ba8e9ab93bafbaed94b4feb902b408ab" exitCode=0 Nov 24 09:29:02 crc kubenswrapper[4563]: I1124 09:29:02.566281 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kj2m6" event={"ID":"ee1e81c9-585e-4c08-b893-a6f74aa84138","Type":"ContainerDied","Data":"7a7089f9c51e1f92aeaef539b16ab537ba8e9ab93bafbaed94b4feb902b408ab"} Nov 24 09:29:03 crc kubenswrapper[4563]: I1124 09:29:03.580477 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kj2m6" 
event={"ID":"ee1e81c9-585e-4c08-b893-a6f74aa84138","Type":"ContainerStarted","Data":"9af3c51b53d26f3831db3effa2f91f537d899669b5dd7c3ff05c7395e79afcc8"} Nov 24 09:29:03 crc kubenswrapper[4563]: I1124 09:29:03.599941 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kj2m6" podStartSLOduration=2.111857951 podStartE2EDuration="3.599924671s" podCreationTimestamp="2025-11-24 09:29:00 +0000 UTC" firstStartedPulling="2025-11-24 09:29:01.559563054 +0000 UTC m=+1518.818540501" lastFinishedPulling="2025-11-24 09:29:03.047629774 +0000 UTC m=+1520.306607221" observedRunningTime="2025-11-24 09:29:03.594528535 +0000 UTC m=+1520.853505982" watchObservedRunningTime="2025-11-24 09:29:03.599924671 +0000 UTC m=+1520.858902119" Nov 24 09:29:08 crc kubenswrapper[4563]: I1124 09:29:08.547266 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zm56j" Nov 24 09:29:08 crc kubenswrapper[4563]: I1124 09:29:08.579809 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zm56j" Nov 24 09:29:08 crc kubenswrapper[4563]: I1124 09:29:08.778541 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zm56j"] Nov 24 09:29:09 crc kubenswrapper[4563]: I1124 09:29:09.626053 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zm56j" podUID="56d2060b-db38-4a2d-8a04-362d9ba093c3" containerName="registry-server" containerID="cri-o://ce3822864260fcd7eb21104ed5f718eec4777823726a124a66ae152de963268f" gracePeriod=2 Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.011269 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zm56j" Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.039524 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56d2060b-db38-4a2d-8a04-362d9ba093c3-utilities\") pod \"56d2060b-db38-4a2d-8a04-362d9ba093c3\" (UID: \"56d2060b-db38-4a2d-8a04-362d9ba093c3\") " Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.039581 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56d2060b-db38-4a2d-8a04-362d9ba093c3-catalog-content\") pod \"56d2060b-db38-4a2d-8a04-362d9ba093c3\" (UID: \"56d2060b-db38-4a2d-8a04-362d9ba093c3\") " Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.040751 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56d2060b-db38-4a2d-8a04-362d9ba093c3-utilities" (OuterVolumeSpecName: "utilities") pod "56d2060b-db38-4a2d-8a04-362d9ba093c3" (UID: "56d2060b-db38-4a2d-8a04-362d9ba093c3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.106461 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56d2060b-db38-4a2d-8a04-362d9ba093c3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56d2060b-db38-4a2d-8a04-362d9ba093c3" (UID: "56d2060b-db38-4a2d-8a04-362d9ba093c3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.141712 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tnqh\" (UniqueName: \"kubernetes.io/projected/56d2060b-db38-4a2d-8a04-362d9ba093c3-kube-api-access-4tnqh\") pod \"56d2060b-db38-4a2d-8a04-362d9ba093c3\" (UID: \"56d2060b-db38-4a2d-8a04-362d9ba093c3\") " Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.142242 4563 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56d2060b-db38-4a2d-8a04-362d9ba093c3-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.142268 4563 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56d2060b-db38-4a2d-8a04-362d9ba093c3-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.148337 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56d2060b-db38-4a2d-8a04-362d9ba093c3-kube-api-access-4tnqh" (OuterVolumeSpecName: "kube-api-access-4tnqh") pod "56d2060b-db38-4a2d-8a04-362d9ba093c3" (UID: "56d2060b-db38-4a2d-8a04-362d9ba093c3"). InnerVolumeSpecName "kube-api-access-4tnqh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.244055 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tnqh\" (UniqueName: \"kubernetes.io/projected/56d2060b-db38-4a2d-8a04-362d9ba093c3-kube-api-access-4tnqh\") on node \"crc\" DevicePath \"\"" Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.508695 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kj2m6" Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.509145 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kj2m6" Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.545828 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kj2m6" Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.637694 4563 generic.go:334] "Generic (PLEG): container finished" podID="424c4e83-a3c6-4eea-958e-e0cf83f20fdf" containerID="2f1f4fd6127c27b7e203626472086cc4bfff1080224d42ff636a457a7451a8f7" exitCode=0 Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.637777 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4bvkr" event={"ID":"424c4e83-a3c6-4eea-958e-e0cf83f20fdf","Type":"ContainerDied","Data":"2f1f4fd6127c27b7e203626472086cc4bfff1080224d42ff636a457a7451a8f7"} Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.640998 4563 generic.go:334] "Generic (PLEG): container finished" podID="56d2060b-db38-4a2d-8a04-362d9ba093c3" containerID="ce3822864260fcd7eb21104ed5f718eec4777823726a124a66ae152de963268f" exitCode=0 Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.641029 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zm56j" 
event={"ID":"56d2060b-db38-4a2d-8a04-362d9ba093c3","Type":"ContainerDied","Data":"ce3822864260fcd7eb21104ed5f718eec4777823726a124a66ae152de963268f"} Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.641073 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zm56j" event={"ID":"56d2060b-db38-4a2d-8a04-362d9ba093c3","Type":"ContainerDied","Data":"cff422e7a45378305e7c780499a4a4ca94379fa21a11362243dc76be4cc4a395"} Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.641096 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zm56j" Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.641108 4563 scope.go:117] "RemoveContainer" containerID="ce3822864260fcd7eb21104ed5f718eec4777823726a124a66ae152de963268f" Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.664375 4563 scope.go:117] "RemoveContainer" containerID="5b4a28a8e78701e1f7847ddb164c8b2e64ab61f10fd62be59dcabbdd4482d881" Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.672228 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zm56j"] Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.677617 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zm56j"] Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.687332 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kj2m6" Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.694783 4563 scope.go:117] "RemoveContainer" containerID="267692549abc9379f2c9e4a5618e4727852839471e53bde19aeae1e70a4fb8ff" Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.737964 4563 scope.go:117] "RemoveContainer" containerID="ce3822864260fcd7eb21104ed5f718eec4777823726a124a66ae152de963268f" Nov 24 09:29:10 crc kubenswrapper[4563]: E1124 09:29:10.738383 4563 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce3822864260fcd7eb21104ed5f718eec4777823726a124a66ae152de963268f\": container with ID starting with ce3822864260fcd7eb21104ed5f718eec4777823726a124a66ae152de963268f not found: ID does not exist" containerID="ce3822864260fcd7eb21104ed5f718eec4777823726a124a66ae152de963268f" Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.738417 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce3822864260fcd7eb21104ed5f718eec4777823726a124a66ae152de963268f"} err="failed to get container status \"ce3822864260fcd7eb21104ed5f718eec4777823726a124a66ae152de963268f\": rpc error: code = NotFound desc = could not find container \"ce3822864260fcd7eb21104ed5f718eec4777823726a124a66ae152de963268f\": container with ID starting with ce3822864260fcd7eb21104ed5f718eec4777823726a124a66ae152de963268f not found: ID does not exist" Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.738440 4563 scope.go:117] "RemoveContainer" containerID="5b4a28a8e78701e1f7847ddb164c8b2e64ab61f10fd62be59dcabbdd4482d881" Nov 24 09:29:10 crc kubenswrapper[4563]: E1124 09:29:10.738829 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b4a28a8e78701e1f7847ddb164c8b2e64ab61f10fd62be59dcabbdd4482d881\": container with ID starting with 5b4a28a8e78701e1f7847ddb164c8b2e64ab61f10fd62be59dcabbdd4482d881 not found: ID does not exist" containerID="5b4a28a8e78701e1f7847ddb164c8b2e64ab61f10fd62be59dcabbdd4482d881" Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.738853 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b4a28a8e78701e1f7847ddb164c8b2e64ab61f10fd62be59dcabbdd4482d881"} err="failed to get container status \"5b4a28a8e78701e1f7847ddb164c8b2e64ab61f10fd62be59dcabbdd4482d881\": rpc error: code = NotFound desc = could 
not find container \"5b4a28a8e78701e1f7847ddb164c8b2e64ab61f10fd62be59dcabbdd4482d881\": container with ID starting with 5b4a28a8e78701e1f7847ddb164c8b2e64ab61f10fd62be59dcabbdd4482d881 not found: ID does not exist" Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.738869 4563 scope.go:117] "RemoveContainer" containerID="267692549abc9379f2c9e4a5618e4727852839471e53bde19aeae1e70a4fb8ff" Nov 24 09:29:10 crc kubenswrapper[4563]: E1124 09:29:10.739172 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"267692549abc9379f2c9e4a5618e4727852839471e53bde19aeae1e70a4fb8ff\": container with ID starting with 267692549abc9379f2c9e4a5618e4727852839471e53bde19aeae1e70a4fb8ff not found: ID does not exist" containerID="267692549abc9379f2c9e4a5618e4727852839471e53bde19aeae1e70a4fb8ff" Nov 24 09:29:10 crc kubenswrapper[4563]: I1124 09:29:10.739212 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"267692549abc9379f2c9e4a5618e4727852839471e53bde19aeae1e70a4fb8ff"} err="failed to get container status \"267692549abc9379f2c9e4a5618e4727852839471e53bde19aeae1e70a4fb8ff\": rpc error: code = NotFound desc = could not find container \"267692549abc9379f2c9e4a5618e4727852839471e53bde19aeae1e70a4fb8ff\": container with ID starting with 267692549abc9379f2c9e4a5618e4727852839471e53bde19aeae1e70a4fb8ff not found: ID does not exist" Nov 24 09:29:11 crc kubenswrapper[4563]: I1124 09:29:11.039737 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-stk25"] Nov 24 09:29:11 crc kubenswrapper[4563]: I1124 09:29:11.046301 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-frjwb"] Nov 24 09:29:11 crc kubenswrapper[4563]: I1124 09:29:11.052966 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-8klw7"] Nov 24 09:29:11 crc kubenswrapper[4563]: I1124 09:29:11.064162 4563 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56d2060b-db38-4a2d-8a04-362d9ba093c3" path="/var/lib/kubelet/pods/56d2060b-db38-4a2d-8a04-362d9ba093c3/volumes" Nov 24 09:29:11 crc kubenswrapper[4563]: I1124 09:29:11.065016 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-stk25"] Nov 24 09:29:11 crc kubenswrapper[4563]: I1124 09:29:11.065269 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-frjwb"] Nov 24 09:29:11 crc kubenswrapper[4563]: I1124 09:29:11.069505 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-8klw7"] Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.008192 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4bvkr" Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.178777 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/424c4e83-a3c6-4eea-958e-e0cf83f20fdf-inventory\") pod \"424c4e83-a3c6-4eea-958e-e0cf83f20fdf\" (UID: \"424c4e83-a3c6-4eea-958e-e0cf83f20fdf\") " Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.179110 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/424c4e83-a3c6-4eea-958e-e0cf83f20fdf-ssh-key\") pod \"424c4e83-a3c6-4eea-958e-e0cf83f20fdf\" (UID: \"424c4e83-a3c6-4eea-958e-e0cf83f20fdf\") " Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.179177 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjrgf\" (UniqueName: \"kubernetes.io/projected/424c4e83-a3c6-4eea-958e-e0cf83f20fdf-kube-api-access-vjrgf\") pod \"424c4e83-a3c6-4eea-958e-e0cf83f20fdf\" (UID: \"424c4e83-a3c6-4eea-958e-e0cf83f20fdf\") " Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.189257 4563 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/424c4e83-a3c6-4eea-958e-e0cf83f20fdf-kube-api-access-vjrgf" (OuterVolumeSpecName: "kube-api-access-vjrgf") pod "424c4e83-a3c6-4eea-958e-e0cf83f20fdf" (UID: "424c4e83-a3c6-4eea-958e-e0cf83f20fdf"). InnerVolumeSpecName "kube-api-access-vjrgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.201151 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/424c4e83-a3c6-4eea-958e-e0cf83f20fdf-inventory" (OuterVolumeSpecName: "inventory") pod "424c4e83-a3c6-4eea-958e-e0cf83f20fdf" (UID: "424c4e83-a3c6-4eea-958e-e0cf83f20fdf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.202785 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/424c4e83-a3c6-4eea-958e-e0cf83f20fdf-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "424c4e83-a3c6-4eea-958e-e0cf83f20fdf" (UID: "424c4e83-a3c6-4eea-958e-e0cf83f20fdf"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.282918 4563 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/424c4e83-a3c6-4eea-958e-e0cf83f20fdf-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.282948 4563 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/424c4e83-a3c6-4eea-958e-e0cf83f20fdf-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.282960 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjrgf\" (UniqueName: \"kubernetes.io/projected/424c4e83-a3c6-4eea-958e-e0cf83f20fdf-kube-api-access-vjrgf\") on node \"crc\" DevicePath \"\"" Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.681391 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4bvkr" Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.681382 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4bvkr" event={"ID":"424c4e83-a3c6-4eea-958e-e0cf83f20fdf","Type":"ContainerDied","Data":"865dbb4d745f5f7b27d25b0446ea1402069d3d72a145cc220cd917551d9953ce"} Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.681888 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="865dbb4d745f5f7b27d25b0446ea1402069d3d72a145cc220cd917551d9953ce" Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.729872 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-64qvz"] Nov 24 09:29:12 crc kubenswrapper[4563]: E1124 09:29:12.730301 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="424c4e83-a3c6-4eea-958e-e0cf83f20fdf" 
containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.730320 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="424c4e83-a3c6-4eea-958e-e0cf83f20fdf" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 24 09:29:12 crc kubenswrapper[4563]: E1124 09:29:12.730354 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56d2060b-db38-4a2d-8a04-362d9ba093c3" containerName="extract-utilities" Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.730362 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="56d2060b-db38-4a2d-8a04-362d9ba093c3" containerName="extract-utilities" Nov 24 09:29:12 crc kubenswrapper[4563]: E1124 09:29:12.730381 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56d2060b-db38-4a2d-8a04-362d9ba093c3" containerName="extract-content" Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.730387 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="56d2060b-db38-4a2d-8a04-362d9ba093c3" containerName="extract-content" Nov 24 09:29:12 crc kubenswrapper[4563]: E1124 09:29:12.730399 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56d2060b-db38-4a2d-8a04-362d9ba093c3" containerName="registry-server" Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.730405 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="56d2060b-db38-4a2d-8a04-362d9ba093c3" containerName="registry-server" Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.730564 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="424c4e83-a3c6-4eea-958e-e0cf83f20fdf" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.730580 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="56d2060b-db38-4a2d-8a04-362d9ba093c3" containerName="registry-server" Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.731287 4563 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-64qvz" Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.732884 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.732979 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jd9jk" Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.735883 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-64qvz"] Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.736300 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.736514 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.776332 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kj2m6"] Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.792603 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db274\" (UniqueName: \"kubernetes.io/projected/dd51baae-5c71-4421-9cc1-1095c3bba2e9-kube-api-access-db274\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-64qvz\" (UID: \"dd51baae-5c71-4421-9cc1-1095c3bba2e9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-64qvz" Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.792697 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd51baae-5c71-4421-9cc1-1095c3bba2e9-inventory\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-64qvz\" (UID: \"dd51baae-5c71-4421-9cc1-1095c3bba2e9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-64qvz" Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.792799 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd51baae-5c71-4421-9cc1-1095c3bba2e9-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-64qvz\" (UID: \"dd51baae-5c71-4421-9cc1-1095c3bba2e9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-64qvz" Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.894676 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db274\" (UniqueName: \"kubernetes.io/projected/dd51baae-5c71-4421-9cc1-1095c3bba2e9-kube-api-access-db274\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-64qvz\" (UID: \"dd51baae-5c71-4421-9cc1-1095c3bba2e9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-64qvz" Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.894783 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd51baae-5c71-4421-9cc1-1095c3bba2e9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-64qvz\" (UID: \"dd51baae-5c71-4421-9cc1-1095c3bba2e9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-64qvz" Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.894899 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd51baae-5c71-4421-9cc1-1095c3bba2e9-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-64qvz\" (UID: \"dd51baae-5c71-4421-9cc1-1095c3bba2e9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-64qvz" Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.900177 4563 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd51baae-5c71-4421-9cc1-1095c3bba2e9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-64qvz\" (UID: \"dd51baae-5c71-4421-9cc1-1095c3bba2e9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-64qvz" Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.902050 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd51baae-5c71-4421-9cc1-1095c3bba2e9-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-64qvz\" (UID: \"dd51baae-5c71-4421-9cc1-1095c3bba2e9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-64qvz" Nov 24 09:29:12 crc kubenswrapper[4563]: I1124 09:29:12.908271 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db274\" (UniqueName: \"kubernetes.io/projected/dd51baae-5c71-4421-9cc1-1095c3bba2e9-kube-api-access-db274\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-64qvz\" (UID: \"dd51baae-5c71-4421-9cc1-1095c3bba2e9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-64qvz" Nov 24 09:29:13 crc kubenswrapper[4563]: I1124 09:29:13.044785 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-64qvz" Nov 24 09:29:13 crc kubenswrapper[4563]: I1124 09:29:13.063997 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0793e21-229f-415e-8b3e-1499e1ed3bf6" path="/var/lib/kubelet/pods/b0793e21-229f-415e-8b3e-1499e1ed3bf6/volumes" Nov 24 09:29:13 crc kubenswrapper[4563]: I1124 09:29:13.064922 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d803d6ca-646a-4dd5-93ef-d096b501c28a" path="/var/lib/kubelet/pods/d803d6ca-646a-4dd5-93ef-d096b501c28a/volumes" Nov 24 09:29:13 crc kubenswrapper[4563]: I1124 09:29:13.065480 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed99b33d-c985-44bf-9a4a-b9f93bf3927f" path="/var/lib/kubelet/pods/ed99b33d-c985-44bf-9a4a-b9f93bf3927f/volumes" Nov 24 09:29:13 crc kubenswrapper[4563]: I1124 09:29:13.488170 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-64qvz"] Nov 24 09:29:13 crc kubenswrapper[4563]: I1124 09:29:13.688626 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-64qvz" event={"ID":"dd51baae-5c71-4421-9cc1-1095c3bba2e9","Type":"ContainerStarted","Data":"aff9426d55cc69bf286665e0a9672f4b88e8b82597e1d072aa09f63e9dba16de"} Nov 24 09:29:13 crc kubenswrapper[4563]: I1124 09:29:13.688787 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kj2m6" podUID="ee1e81c9-585e-4c08-b893-a6f74aa84138" containerName="registry-server" containerID="cri-o://9af3c51b53d26f3831db3effa2f91f537d899669b5dd7c3ff05c7395e79afcc8" gracePeriod=2 Nov 24 09:29:14 crc kubenswrapper[4563]: I1124 09:29:14.049402 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kj2m6" Nov 24 09:29:14 crc kubenswrapper[4563]: I1124 09:29:14.217579 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nrsj\" (UniqueName: \"kubernetes.io/projected/ee1e81c9-585e-4c08-b893-a6f74aa84138-kube-api-access-5nrsj\") pod \"ee1e81c9-585e-4c08-b893-a6f74aa84138\" (UID: \"ee1e81c9-585e-4c08-b893-a6f74aa84138\") " Nov 24 09:29:14 crc kubenswrapper[4563]: I1124 09:29:14.217687 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee1e81c9-585e-4c08-b893-a6f74aa84138-catalog-content\") pod \"ee1e81c9-585e-4c08-b893-a6f74aa84138\" (UID: \"ee1e81c9-585e-4c08-b893-a6f74aa84138\") " Nov 24 09:29:14 crc kubenswrapper[4563]: I1124 09:29:14.217723 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee1e81c9-585e-4c08-b893-a6f74aa84138-utilities\") pod \"ee1e81c9-585e-4c08-b893-a6f74aa84138\" (UID: \"ee1e81c9-585e-4c08-b893-a6f74aa84138\") " Nov 24 09:29:14 crc kubenswrapper[4563]: I1124 09:29:14.218485 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee1e81c9-585e-4c08-b893-a6f74aa84138-utilities" (OuterVolumeSpecName: "utilities") pod "ee1e81c9-585e-4c08-b893-a6f74aa84138" (UID: "ee1e81c9-585e-4c08-b893-a6f74aa84138"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:29:14 crc kubenswrapper[4563]: I1124 09:29:14.220658 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee1e81c9-585e-4c08-b893-a6f74aa84138-kube-api-access-5nrsj" (OuterVolumeSpecName: "kube-api-access-5nrsj") pod "ee1e81c9-585e-4c08-b893-a6f74aa84138" (UID: "ee1e81c9-585e-4c08-b893-a6f74aa84138"). InnerVolumeSpecName "kube-api-access-5nrsj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:29:14 crc kubenswrapper[4563]: I1124 09:29:14.230201 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee1e81c9-585e-4c08-b893-a6f74aa84138-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee1e81c9-585e-4c08-b893-a6f74aa84138" (UID: "ee1e81c9-585e-4c08-b893-a6f74aa84138"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:29:14 crc kubenswrapper[4563]: I1124 09:29:14.319582 4563 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee1e81c9-585e-4c08-b893-a6f74aa84138-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 09:29:14 crc kubenswrapper[4563]: I1124 09:29:14.319768 4563 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee1e81c9-585e-4c08-b893-a6f74aa84138-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 09:29:14 crc kubenswrapper[4563]: I1124 09:29:14.319778 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nrsj\" (UniqueName: \"kubernetes.io/projected/ee1e81c9-585e-4c08-b893-a6f74aa84138-kube-api-access-5nrsj\") on node \"crc\" DevicePath \"\"" Nov 24 09:29:14 crc kubenswrapper[4563]: I1124 09:29:14.697948 4563 generic.go:334] "Generic (PLEG): container finished" podID="ee1e81c9-585e-4c08-b893-a6f74aa84138" containerID="9af3c51b53d26f3831db3effa2f91f537d899669b5dd7c3ff05c7395e79afcc8" exitCode=0 Nov 24 09:29:14 crc kubenswrapper[4563]: I1124 09:29:14.698000 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kj2m6" Nov 24 09:29:14 crc kubenswrapper[4563]: I1124 09:29:14.698012 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kj2m6" event={"ID":"ee1e81c9-585e-4c08-b893-a6f74aa84138","Type":"ContainerDied","Data":"9af3c51b53d26f3831db3effa2f91f537d899669b5dd7c3ff05c7395e79afcc8"} Nov 24 09:29:14 crc kubenswrapper[4563]: I1124 09:29:14.698045 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kj2m6" event={"ID":"ee1e81c9-585e-4c08-b893-a6f74aa84138","Type":"ContainerDied","Data":"67beceae60ec2ab05b28f4fb33bb790d73f730ab88436474db4ada5e7f4fc4b6"} Nov 24 09:29:14 crc kubenswrapper[4563]: I1124 09:29:14.698067 4563 scope.go:117] "RemoveContainer" containerID="9af3c51b53d26f3831db3effa2f91f537d899669b5dd7c3ff05c7395e79afcc8" Nov 24 09:29:14 crc kubenswrapper[4563]: I1124 09:29:14.699372 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-64qvz" event={"ID":"dd51baae-5c71-4421-9cc1-1095c3bba2e9","Type":"ContainerStarted","Data":"0c625dbc1b13ec19b4e385d1bab5b0987535cb0c92e0355e6c5084ed9ee7c661"} Nov 24 09:29:14 crc kubenswrapper[4563]: I1124 09:29:14.714455 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-64qvz" podStartSLOduration=2.050196408 podStartE2EDuration="2.714438624s" podCreationTimestamp="2025-11-24 09:29:12 +0000 UTC" firstStartedPulling="2025-11-24 09:29:13.49373775 +0000 UTC m=+1530.752715197" lastFinishedPulling="2025-11-24 09:29:14.157979965 +0000 UTC m=+1531.416957413" observedRunningTime="2025-11-24 09:29:14.711407889 +0000 UTC m=+1531.970385336" watchObservedRunningTime="2025-11-24 09:29:14.714438624 +0000 UTC m=+1531.973416071" Nov 24 09:29:14 crc kubenswrapper[4563]: I1124 09:29:14.725569 4563 scope.go:117] "RemoveContainer" 
containerID="7a7089f9c51e1f92aeaef539b16ab537ba8e9ab93bafbaed94b4feb902b408ab" Nov 24 09:29:14 crc kubenswrapper[4563]: I1124 09:29:14.733672 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kj2m6"] Nov 24 09:29:14 crc kubenswrapper[4563]: I1124 09:29:14.739142 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kj2m6"] Nov 24 09:29:14 crc kubenswrapper[4563]: I1124 09:29:14.743754 4563 scope.go:117] "RemoveContainer" containerID="db6aae2070efa72533f807a2abf3f6a8bd8f926c7a394be7b47ceead730cdfd5" Nov 24 09:29:14 crc kubenswrapper[4563]: I1124 09:29:14.756425 4563 scope.go:117] "RemoveContainer" containerID="9af3c51b53d26f3831db3effa2f91f537d899669b5dd7c3ff05c7395e79afcc8" Nov 24 09:29:14 crc kubenswrapper[4563]: E1124 09:29:14.756772 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9af3c51b53d26f3831db3effa2f91f537d899669b5dd7c3ff05c7395e79afcc8\": container with ID starting with 9af3c51b53d26f3831db3effa2f91f537d899669b5dd7c3ff05c7395e79afcc8 not found: ID does not exist" containerID="9af3c51b53d26f3831db3effa2f91f537d899669b5dd7c3ff05c7395e79afcc8" Nov 24 09:29:14 crc kubenswrapper[4563]: I1124 09:29:14.756802 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9af3c51b53d26f3831db3effa2f91f537d899669b5dd7c3ff05c7395e79afcc8"} err="failed to get container status \"9af3c51b53d26f3831db3effa2f91f537d899669b5dd7c3ff05c7395e79afcc8\": rpc error: code = NotFound desc = could not find container \"9af3c51b53d26f3831db3effa2f91f537d899669b5dd7c3ff05c7395e79afcc8\": container with ID starting with 9af3c51b53d26f3831db3effa2f91f537d899669b5dd7c3ff05c7395e79afcc8 not found: ID does not exist" Nov 24 09:29:14 crc kubenswrapper[4563]: I1124 09:29:14.756821 4563 scope.go:117] "RemoveContainer" 
containerID="7a7089f9c51e1f92aeaef539b16ab537ba8e9ab93bafbaed94b4feb902b408ab" Nov 24 09:29:14 crc kubenswrapper[4563]: E1124 09:29:14.757141 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a7089f9c51e1f92aeaef539b16ab537ba8e9ab93bafbaed94b4feb902b408ab\": container with ID starting with 7a7089f9c51e1f92aeaef539b16ab537ba8e9ab93bafbaed94b4feb902b408ab not found: ID does not exist" containerID="7a7089f9c51e1f92aeaef539b16ab537ba8e9ab93bafbaed94b4feb902b408ab" Nov 24 09:29:14 crc kubenswrapper[4563]: I1124 09:29:14.757167 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a7089f9c51e1f92aeaef539b16ab537ba8e9ab93bafbaed94b4feb902b408ab"} err="failed to get container status \"7a7089f9c51e1f92aeaef539b16ab537ba8e9ab93bafbaed94b4feb902b408ab\": rpc error: code = NotFound desc = could not find container \"7a7089f9c51e1f92aeaef539b16ab537ba8e9ab93bafbaed94b4feb902b408ab\": container with ID starting with 7a7089f9c51e1f92aeaef539b16ab537ba8e9ab93bafbaed94b4feb902b408ab not found: ID does not exist" Nov 24 09:29:14 crc kubenswrapper[4563]: I1124 09:29:14.757179 4563 scope.go:117] "RemoveContainer" containerID="db6aae2070efa72533f807a2abf3f6a8bd8f926c7a394be7b47ceead730cdfd5" Nov 24 09:29:14 crc kubenswrapper[4563]: E1124 09:29:14.757451 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db6aae2070efa72533f807a2abf3f6a8bd8f926c7a394be7b47ceead730cdfd5\": container with ID starting with db6aae2070efa72533f807a2abf3f6a8bd8f926c7a394be7b47ceead730cdfd5 not found: ID does not exist" containerID="db6aae2070efa72533f807a2abf3f6a8bd8f926c7a394be7b47ceead730cdfd5" Nov 24 09:29:14 crc kubenswrapper[4563]: I1124 09:29:14.757470 4563 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"db6aae2070efa72533f807a2abf3f6a8bd8f926c7a394be7b47ceead730cdfd5"} err="failed to get container status \"db6aae2070efa72533f807a2abf3f6a8bd8f926c7a394be7b47ceead730cdfd5\": rpc error: code = NotFound desc = could not find container \"db6aae2070efa72533f807a2abf3f6a8bd8f926c7a394be7b47ceead730cdfd5\": container with ID starting with db6aae2070efa72533f807a2abf3f6a8bd8f926c7a394be7b47ceead730cdfd5 not found: ID does not exist" Nov 24 09:29:15 crc kubenswrapper[4563]: I1124 09:29:15.065277 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee1e81c9-585e-4c08-b893-a6f74aa84138" path="/var/lib/kubelet/pods/ee1e81c9-585e-4c08-b893-a6f74aa84138/volumes" Nov 24 09:29:46 crc kubenswrapper[4563]: I1124 09:29:46.033727 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-5dqsr"] Nov 24 09:29:46 crc kubenswrapper[4563]: I1124 09:29:46.042028 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-zzpdk"] Nov 24 09:29:46 crc kubenswrapper[4563]: I1124 09:29:46.047305 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-5dqsr"] Nov 24 09:29:46 crc kubenswrapper[4563]: I1124 09:29:46.052409 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0283-account-create-wklw4"] Nov 24 09:29:46 crc kubenswrapper[4563]: I1124 09:29:46.059567 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-9a24-account-create-gdlvv"] Nov 24 09:29:46 crc kubenswrapper[4563]: I1124 09:29:46.064563 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-zzpdk"] Nov 24 09:29:46 crc kubenswrapper[4563]: I1124 09:29:46.069370 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-9nwp9"] Nov 24 09:29:46 crc kubenswrapper[4563]: I1124 09:29:46.073393 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-api-0283-account-create-wklw4"] Nov 24 09:29:46 crc kubenswrapper[4563]: I1124 09:29:46.078652 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-9a24-account-create-gdlvv"] Nov 24 09:29:46 crc kubenswrapper[4563]: I1124 09:29:46.082777 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-9nwp9"] Nov 24 09:29:46 crc kubenswrapper[4563]: I1124 09:29:46.086847 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-04d9-account-create-dm4rx"] Nov 24 09:29:46 crc kubenswrapper[4563]: I1124 09:29:46.090813 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-04d9-account-create-dm4rx"] Nov 24 09:29:47 crc kubenswrapper[4563]: I1124 09:29:47.071056 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cdd6b81-e677-43b6-a627-ac55f41bb1de" path="/var/lib/kubelet/pods/2cdd6b81-e677-43b6-a627-ac55f41bb1de/volumes" Nov 24 09:29:47 crc kubenswrapper[4563]: I1124 09:29:47.071746 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54187c3a-449a-420f-a04c-f4e6c6f7fc3f" path="/var/lib/kubelet/pods/54187c3a-449a-420f-a04c-f4e6c6f7fc3f/volumes" Nov 24 09:29:47 crc kubenswrapper[4563]: I1124 09:29:47.072348 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71b426e4-d393-4260-b66f-28c288ce8e89" path="/var/lib/kubelet/pods/71b426e4-d393-4260-b66f-28c288ce8e89/volumes" Nov 24 09:29:47 crc kubenswrapper[4563]: I1124 09:29:47.073023 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71b6f993-c4ff-4ea9-9306-5d30c09f0f8c" path="/var/lib/kubelet/pods/71b6f993-c4ff-4ea9-9306-5d30c09f0f8c/volumes" Nov 24 09:29:47 crc kubenswrapper[4563]: I1124 09:29:47.074159 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0ee4dc6-69fd-4151-9e35-68fde68c500e" path="/var/lib/kubelet/pods/f0ee4dc6-69fd-4151-9e35-68fde68c500e/volumes" Nov 24 09:29:47 
crc kubenswrapper[4563]: I1124 09:29:47.075589 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd2c44a4-86c6-43b2-9f19-ab409f2eaded" path="/var/lib/kubelet/pods/fd2c44a4-86c6-43b2-9f19-ab409f2eaded/volumes" Nov 24 09:29:48 crc kubenswrapper[4563]: I1124 09:29:48.972172 4563 generic.go:334] "Generic (PLEG): container finished" podID="dd51baae-5c71-4421-9cc1-1095c3bba2e9" containerID="0c625dbc1b13ec19b4e385d1bab5b0987535cb0c92e0355e6c5084ed9ee7c661" exitCode=0 Nov 24 09:29:48 crc kubenswrapper[4563]: I1124 09:29:48.972473 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-64qvz" event={"ID":"dd51baae-5c71-4421-9cc1-1095c3bba2e9","Type":"ContainerDied","Data":"0c625dbc1b13ec19b4e385d1bab5b0987535cb0c92e0355e6c5084ed9ee7c661"} Nov 24 09:29:50 crc kubenswrapper[4563]: I1124 09:29:50.314356 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-64qvz" Nov 24 09:29:50 crc kubenswrapper[4563]: I1124 09:29:50.436519 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd51baae-5c71-4421-9cc1-1095c3bba2e9-inventory\") pod \"dd51baae-5c71-4421-9cc1-1095c3bba2e9\" (UID: \"dd51baae-5c71-4421-9cc1-1095c3bba2e9\") " Nov 24 09:29:50 crc kubenswrapper[4563]: I1124 09:29:50.436718 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db274\" (UniqueName: \"kubernetes.io/projected/dd51baae-5c71-4421-9cc1-1095c3bba2e9-kube-api-access-db274\") pod \"dd51baae-5c71-4421-9cc1-1095c3bba2e9\" (UID: \"dd51baae-5c71-4421-9cc1-1095c3bba2e9\") " Nov 24 09:29:50 crc kubenswrapper[4563]: I1124 09:29:50.436831 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd51baae-5c71-4421-9cc1-1095c3bba2e9-ssh-key\") pod 
\"dd51baae-5c71-4421-9cc1-1095c3bba2e9\" (UID: \"dd51baae-5c71-4421-9cc1-1095c3bba2e9\") " Nov 24 09:29:50 crc kubenswrapper[4563]: I1124 09:29:50.442754 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd51baae-5c71-4421-9cc1-1095c3bba2e9-kube-api-access-db274" (OuterVolumeSpecName: "kube-api-access-db274") pod "dd51baae-5c71-4421-9cc1-1095c3bba2e9" (UID: "dd51baae-5c71-4421-9cc1-1095c3bba2e9"). InnerVolumeSpecName "kube-api-access-db274". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:29:50 crc kubenswrapper[4563]: I1124 09:29:50.463906 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd51baae-5c71-4421-9cc1-1095c3bba2e9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dd51baae-5c71-4421-9cc1-1095c3bba2e9" (UID: "dd51baae-5c71-4421-9cc1-1095c3bba2e9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:29:50 crc kubenswrapper[4563]: I1124 09:29:50.464005 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd51baae-5c71-4421-9cc1-1095c3bba2e9-inventory" (OuterVolumeSpecName: "inventory") pod "dd51baae-5c71-4421-9cc1-1095c3bba2e9" (UID: "dd51baae-5c71-4421-9cc1-1095c3bba2e9"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:29:50 crc kubenswrapper[4563]: I1124 09:29:50.541732 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-db274\" (UniqueName: \"kubernetes.io/projected/dd51baae-5c71-4421-9cc1-1095c3bba2e9-kube-api-access-db274\") on node \"crc\" DevicePath \"\"" Nov 24 09:29:50 crc kubenswrapper[4563]: I1124 09:29:50.542030 4563 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd51baae-5c71-4421-9cc1-1095c3bba2e9-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 09:29:50 crc kubenswrapper[4563]: I1124 09:29:50.542044 4563 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd51baae-5c71-4421-9cc1-1095c3bba2e9-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 09:29:50 crc kubenswrapper[4563]: I1124 09:29:50.992055 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-64qvz" event={"ID":"dd51baae-5c71-4421-9cc1-1095c3bba2e9","Type":"ContainerDied","Data":"aff9426d55cc69bf286665e0a9672f4b88e8b82597e1d072aa09f63e9dba16de"} Nov 24 09:29:50 crc kubenswrapper[4563]: I1124 09:29:50.992101 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aff9426d55cc69bf286665e0a9672f4b88e8b82597e1d072aa09f63e9dba16de" Nov 24 09:29:50 crc kubenswrapper[4563]: I1124 09:29:50.992181 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-64qvz" Nov 24 09:29:51 crc kubenswrapper[4563]: I1124 09:29:51.064128 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-4lklp"] Nov 24 09:29:51 crc kubenswrapper[4563]: E1124 09:29:51.064447 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee1e81c9-585e-4c08-b893-a6f74aa84138" containerName="extract-content" Nov 24 09:29:51 crc kubenswrapper[4563]: I1124 09:29:51.064465 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee1e81c9-585e-4c08-b893-a6f74aa84138" containerName="extract-content" Nov 24 09:29:51 crc kubenswrapper[4563]: E1124 09:29:51.064480 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd51baae-5c71-4421-9cc1-1095c3bba2e9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 24 09:29:51 crc kubenswrapper[4563]: I1124 09:29:51.064487 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd51baae-5c71-4421-9cc1-1095c3bba2e9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 24 09:29:51 crc kubenswrapper[4563]: E1124 09:29:51.064511 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee1e81c9-585e-4c08-b893-a6f74aa84138" containerName="registry-server" Nov 24 09:29:51 crc kubenswrapper[4563]: I1124 09:29:51.064517 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee1e81c9-585e-4c08-b893-a6f74aa84138" containerName="registry-server" Nov 24 09:29:51 crc kubenswrapper[4563]: E1124 09:29:51.064530 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee1e81c9-585e-4c08-b893-a6f74aa84138" containerName="extract-utilities" Nov 24 09:29:51 crc kubenswrapper[4563]: I1124 09:29:51.064536 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee1e81c9-585e-4c08-b893-a6f74aa84138" containerName="extract-utilities" Nov 24 09:29:51 crc kubenswrapper[4563]: I1124 09:29:51.064712 4563 
memory_manager.go:354] "RemoveStaleState removing state" podUID="ee1e81c9-585e-4c08-b893-a6f74aa84138" containerName="registry-server" Nov 24 09:29:51 crc kubenswrapper[4563]: I1124 09:29:51.064731 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd51baae-5c71-4421-9cc1-1095c3bba2e9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 24 09:29:51 crc kubenswrapper[4563]: I1124 09:29:51.065626 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4lklp" Nov 24 09:29:51 crc kubenswrapper[4563]: I1124 09:29:51.067313 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 24 09:29:51 crc kubenswrapper[4563]: I1124 09:29:51.067475 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 09:29:51 crc kubenswrapper[4563]: I1124 09:29:51.067564 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jd9jk" Nov 24 09:29:51 crc kubenswrapper[4563]: I1124 09:29:51.068065 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 24 09:29:51 crc kubenswrapper[4563]: I1124 09:29:51.071672 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-4lklp"] Nov 24 09:29:51 crc kubenswrapper[4563]: I1124 09:29:51.152712 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/77470e38-d989-4832-8cac-4b2f1a8f2d14-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-4lklp\" (UID: \"77470e38-d989-4832-8cac-4b2f1a8f2d14\") " pod="openstack/ssh-known-hosts-edpm-deployment-4lklp" Nov 24 09:29:51 crc kubenswrapper[4563]: I1124 09:29:51.153147 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77470e38-d989-4832-8cac-4b2f1a8f2d14-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-4lklp\" (UID: \"77470e38-d989-4832-8cac-4b2f1a8f2d14\") " pod="openstack/ssh-known-hosts-edpm-deployment-4lklp" Nov 24 09:29:51 crc kubenswrapper[4563]: I1124 09:29:51.153328 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp8rw\" (UniqueName: \"kubernetes.io/projected/77470e38-d989-4832-8cac-4b2f1a8f2d14-kube-api-access-fp8rw\") pod \"ssh-known-hosts-edpm-deployment-4lklp\" (UID: \"77470e38-d989-4832-8cac-4b2f1a8f2d14\") " pod="openstack/ssh-known-hosts-edpm-deployment-4lklp" Nov 24 09:29:51 crc kubenswrapper[4563]: I1124 09:29:51.255499 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/77470e38-d989-4832-8cac-4b2f1a8f2d14-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-4lklp\" (UID: \"77470e38-d989-4832-8cac-4b2f1a8f2d14\") " pod="openstack/ssh-known-hosts-edpm-deployment-4lklp" Nov 24 09:29:51 crc kubenswrapper[4563]: I1124 09:29:51.255744 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77470e38-d989-4832-8cac-4b2f1a8f2d14-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-4lklp\" (UID: \"77470e38-d989-4832-8cac-4b2f1a8f2d14\") " pod="openstack/ssh-known-hosts-edpm-deployment-4lklp" Nov 24 09:29:51 crc kubenswrapper[4563]: I1124 09:29:51.255822 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp8rw\" (UniqueName: \"kubernetes.io/projected/77470e38-d989-4832-8cac-4b2f1a8f2d14-kube-api-access-fp8rw\") pod \"ssh-known-hosts-edpm-deployment-4lklp\" (UID: \"77470e38-d989-4832-8cac-4b2f1a8f2d14\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-4lklp" Nov 24 09:29:51 crc kubenswrapper[4563]: I1124 09:29:51.284519 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp8rw\" (UniqueName: \"kubernetes.io/projected/77470e38-d989-4832-8cac-4b2f1a8f2d14-kube-api-access-fp8rw\") pod \"ssh-known-hosts-edpm-deployment-4lklp\" (UID: \"77470e38-d989-4832-8cac-4b2f1a8f2d14\") " pod="openstack/ssh-known-hosts-edpm-deployment-4lklp" Nov 24 09:29:51 crc kubenswrapper[4563]: I1124 09:29:51.285243 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77470e38-d989-4832-8cac-4b2f1a8f2d14-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-4lklp\" (UID: \"77470e38-d989-4832-8cac-4b2f1a8f2d14\") " pod="openstack/ssh-known-hosts-edpm-deployment-4lklp" Nov 24 09:29:51 crc kubenswrapper[4563]: I1124 09:29:51.290622 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/77470e38-d989-4832-8cac-4b2f1a8f2d14-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-4lklp\" (UID: \"77470e38-d989-4832-8cac-4b2f1a8f2d14\") " pod="openstack/ssh-known-hosts-edpm-deployment-4lklp" Nov 24 09:29:51 crc kubenswrapper[4563]: I1124 09:29:51.381425 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4lklp" Nov 24 09:29:51 crc kubenswrapper[4563]: I1124 09:29:51.847734 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-4lklp"] Nov 24 09:29:51 crc kubenswrapper[4563]: I1124 09:29:51.886065 4563 scope.go:117] "RemoveContainer" containerID="a75fb20e0f9fb563e540d2e760669d37d93203806b15615e19c69e43dc7709ed" Nov 24 09:29:51 crc kubenswrapper[4563]: I1124 09:29:51.912708 4563 scope.go:117] "RemoveContainer" containerID="cb121a91c0552ff81f840885d3fb56bbadff92e9ebc7c462dda9fab6858b5336" Nov 24 09:29:51 crc kubenswrapper[4563]: I1124 09:29:51.929509 4563 scope.go:117] "RemoveContainer" containerID="3c26adbc8059ff965ca66c7fa73df912c4211f8febaa28943e3cffd7423ac3c5" Nov 24 09:29:51 crc kubenswrapper[4563]: I1124 09:29:51.947583 4563 scope.go:117] "RemoveContainer" containerID="484888d9344f9856ecfdccaba8884e59ceabedfdbe8d25da5cb2812646da6652" Nov 24 09:29:51 crc kubenswrapper[4563]: I1124 09:29:51.982427 4563 scope.go:117] "RemoveContainer" containerID="1e68b8254c01f5945197761aa93a0a3825f41da8f8a75fadc6d0a90987706bf4" Nov 24 09:29:52 crc kubenswrapper[4563]: I1124 09:29:52.005340 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4lklp" event={"ID":"77470e38-d989-4832-8cac-4b2f1a8f2d14","Type":"ContainerStarted","Data":"ffc6de4eb65a83c2a958b9c4c2a1f6e54454f2703f629b594d8a695fa824dcd3"} Nov 24 09:29:52 crc kubenswrapper[4563]: I1124 09:29:52.006276 4563 scope.go:117] "RemoveContainer" containerID="fdf38bf1b96efef1bc076e890ef650066b9626aa1431bc638ade4e93c4df030b" Nov 24 09:29:52 crc kubenswrapper[4563]: I1124 09:29:52.030570 4563 scope.go:117] "RemoveContainer" containerID="6f2af6eac37517ee8e9aee60a7627d32c3557a461b9321cebbc2c5326f4e2fa2" Nov 24 09:29:52 crc kubenswrapper[4563]: I1124 09:29:52.048327 4563 scope.go:117] "RemoveContainer" 
containerID="7bd613c1e631f06f8c99dc293308cdd878b574255f51170c4e79e7db1a5cba3f" Nov 24 09:29:52 crc kubenswrapper[4563]: I1124 09:29:52.062789 4563 scope.go:117] "RemoveContainer" containerID="c95204df4565a0ad14fee7b6fccca9aea380e3d54a7217bffee4ab5f68755207" Nov 24 09:29:52 crc kubenswrapper[4563]: I1124 09:29:52.079949 4563 scope.go:117] "RemoveContainer" containerID="ee740e0fa40e50447767b410027ff504c58fc3846d856cd5523dc356da7c8e76" Nov 24 09:29:53 crc kubenswrapper[4563]: I1124 09:29:53.023889 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4lklp" event={"ID":"77470e38-d989-4832-8cac-4b2f1a8f2d14","Type":"ContainerStarted","Data":"759be60136c8e2ea84a6a67041d79b07062b007848119eac48eda401d2b8a9b8"} Nov 24 09:29:53 crc kubenswrapper[4563]: I1124 09:29:53.038543 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-4lklp" podStartSLOduration=1.546321702 podStartE2EDuration="2.038529893s" podCreationTimestamp="2025-11-24 09:29:51 +0000 UTC" firstStartedPulling="2025-11-24 09:29:51.853630877 +0000 UTC m=+1569.112608324" lastFinishedPulling="2025-11-24 09:29:52.345839067 +0000 UTC m=+1569.604816515" observedRunningTime="2025-11-24 09:29:53.033559259 +0000 UTC m=+1570.292536706" watchObservedRunningTime="2025-11-24 09:29:53.038529893 +0000 UTC m=+1570.297507330" Nov 24 09:29:58 crc kubenswrapper[4563]: I1124 09:29:58.070943 4563 generic.go:334] "Generic (PLEG): container finished" podID="77470e38-d989-4832-8cac-4b2f1a8f2d14" containerID="759be60136c8e2ea84a6a67041d79b07062b007848119eac48eda401d2b8a9b8" exitCode=0 Nov 24 09:29:58 crc kubenswrapper[4563]: I1124 09:29:58.071050 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4lklp" event={"ID":"77470e38-d989-4832-8cac-4b2f1a8f2d14","Type":"ContainerDied","Data":"759be60136c8e2ea84a6a67041d79b07062b007848119eac48eda401d2b8a9b8"} Nov 24 09:29:59 crc 
kubenswrapper[4563]: I1124 09:29:59.484457 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4lklp" Nov 24 09:29:59 crc kubenswrapper[4563]: I1124 09:29:59.627476 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77470e38-d989-4832-8cac-4b2f1a8f2d14-ssh-key-openstack-edpm-ipam\") pod \"77470e38-d989-4832-8cac-4b2f1a8f2d14\" (UID: \"77470e38-d989-4832-8cac-4b2f1a8f2d14\") " Nov 24 09:29:59 crc kubenswrapper[4563]: I1124 09:29:59.627814 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp8rw\" (UniqueName: \"kubernetes.io/projected/77470e38-d989-4832-8cac-4b2f1a8f2d14-kube-api-access-fp8rw\") pod \"77470e38-d989-4832-8cac-4b2f1a8f2d14\" (UID: \"77470e38-d989-4832-8cac-4b2f1a8f2d14\") " Nov 24 09:29:59 crc kubenswrapper[4563]: I1124 09:29:59.627944 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/77470e38-d989-4832-8cac-4b2f1a8f2d14-inventory-0\") pod \"77470e38-d989-4832-8cac-4b2f1a8f2d14\" (UID: \"77470e38-d989-4832-8cac-4b2f1a8f2d14\") " Nov 24 09:29:59 crc kubenswrapper[4563]: I1124 09:29:59.633985 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77470e38-d989-4832-8cac-4b2f1a8f2d14-kube-api-access-fp8rw" (OuterVolumeSpecName: "kube-api-access-fp8rw") pod "77470e38-d989-4832-8cac-4b2f1a8f2d14" (UID: "77470e38-d989-4832-8cac-4b2f1a8f2d14"). InnerVolumeSpecName "kube-api-access-fp8rw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:29:59 crc kubenswrapper[4563]: I1124 09:29:59.652559 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77470e38-d989-4832-8cac-4b2f1a8f2d14-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "77470e38-d989-4832-8cac-4b2f1a8f2d14" (UID: "77470e38-d989-4832-8cac-4b2f1a8f2d14"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:29:59 crc kubenswrapper[4563]: I1124 09:29:59.655258 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77470e38-d989-4832-8cac-4b2f1a8f2d14-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "77470e38-d989-4832-8cac-4b2f1a8f2d14" (UID: "77470e38-d989-4832-8cac-4b2f1a8f2d14"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:29:59 crc kubenswrapper[4563]: I1124 09:29:59.730191 4563 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77470e38-d989-4832-8cac-4b2f1a8f2d14-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 24 09:29:59 crc kubenswrapper[4563]: I1124 09:29:59.730219 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp8rw\" (UniqueName: \"kubernetes.io/projected/77470e38-d989-4832-8cac-4b2f1a8f2d14-kube-api-access-fp8rw\") on node \"crc\" DevicePath \"\"" Nov 24 09:29:59 crc kubenswrapper[4563]: I1124 09:29:59.730231 4563 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/77470e38-d989-4832-8cac-4b2f1a8f2d14-inventory-0\") on node \"crc\" DevicePath \"\"" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.098249 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4lklp" 
event={"ID":"77470e38-d989-4832-8cac-4b2f1a8f2d14","Type":"ContainerDied","Data":"ffc6de4eb65a83c2a958b9c4c2a1f6e54454f2703f629b594d8a695fa824dcd3"} Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.098297 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffc6de4eb65a83c2a958b9c4c2a1f6e54454f2703f629b594d8a695fa824dcd3" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.098411 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4lklp" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.146390 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399610-bv55t"] Nov 24 09:30:00 crc kubenswrapper[4563]: E1124 09:30:00.147024 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77470e38-d989-4832-8cac-4b2f1a8f2d14" containerName="ssh-known-hosts-edpm-deployment" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.147043 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="77470e38-d989-4832-8cac-4b2f1a8f2d14" containerName="ssh-known-hosts-edpm-deployment" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.147346 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="77470e38-d989-4832-8cac-4b2f1a8f2d14" containerName="ssh-known-hosts-edpm-deployment" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.147961 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399610-bv55t" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.150164 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.151508 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.153736 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-mbq9m"] Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.154858 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mbq9m" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.156759 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.156786 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.156980 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.162216 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jd9jk" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.175737 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-mbq9m"] Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.192486 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399610-bv55t"] Nov 24 09:30:00 
crc kubenswrapper[4563]: I1124 09:30:00.341876 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c8408d5-8f15-4067-b9ae-9b9640f8cdfa-secret-volume\") pod \"collect-profiles-29399610-bv55t\" (UID: \"1c8408d5-8f15-4067-b9ae-9b9640f8cdfa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399610-bv55t" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.341984 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c8408d5-8f15-4067-b9ae-9b9640f8cdfa-config-volume\") pod \"collect-profiles-29399610-bv55t\" (UID: \"1c8408d5-8f15-4067-b9ae-9b9640f8cdfa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399610-bv55t" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.342017 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f2b4785-aae5-4031-9e66-c3601ef67b6a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mbq9m\" (UID: \"5f2b4785-aae5-4031-9e66-c3601ef67b6a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mbq9m" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.342067 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f2b4785-aae5-4031-9e66-c3601ef67b6a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mbq9m\" (UID: \"5f2b4785-aae5-4031-9e66-c3601ef67b6a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mbq9m" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.342298 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b47ww\" (UniqueName: 
\"kubernetes.io/projected/1c8408d5-8f15-4067-b9ae-9b9640f8cdfa-kube-api-access-b47ww\") pod \"collect-profiles-29399610-bv55t\" (UID: \"1c8408d5-8f15-4067-b9ae-9b9640f8cdfa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399610-bv55t" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.342353 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsqck\" (UniqueName: \"kubernetes.io/projected/5f2b4785-aae5-4031-9e66-c3601ef67b6a-kube-api-access-vsqck\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mbq9m\" (UID: \"5f2b4785-aae5-4031-9e66-c3601ef67b6a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mbq9m" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.444051 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b47ww\" (UniqueName: \"kubernetes.io/projected/1c8408d5-8f15-4067-b9ae-9b9640f8cdfa-kube-api-access-b47ww\") pod \"collect-profiles-29399610-bv55t\" (UID: \"1c8408d5-8f15-4067-b9ae-9b9640f8cdfa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399610-bv55t" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.444103 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsqck\" (UniqueName: \"kubernetes.io/projected/5f2b4785-aae5-4031-9e66-c3601ef67b6a-kube-api-access-vsqck\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mbq9m\" (UID: \"5f2b4785-aae5-4031-9e66-c3601ef67b6a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mbq9m" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.444207 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c8408d5-8f15-4067-b9ae-9b9640f8cdfa-secret-volume\") pod \"collect-profiles-29399610-bv55t\" (UID: \"1c8408d5-8f15-4067-b9ae-9b9640f8cdfa\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29399610-bv55t" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.444293 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c8408d5-8f15-4067-b9ae-9b9640f8cdfa-config-volume\") pod \"collect-profiles-29399610-bv55t\" (UID: \"1c8408d5-8f15-4067-b9ae-9b9640f8cdfa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399610-bv55t" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.444329 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f2b4785-aae5-4031-9e66-c3601ef67b6a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mbq9m\" (UID: \"5f2b4785-aae5-4031-9e66-c3601ef67b6a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mbq9m" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.444392 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f2b4785-aae5-4031-9e66-c3601ef67b6a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mbq9m\" (UID: \"5f2b4785-aae5-4031-9e66-c3601ef67b6a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mbq9m" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.445325 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c8408d5-8f15-4067-b9ae-9b9640f8cdfa-config-volume\") pod \"collect-profiles-29399610-bv55t\" (UID: \"1c8408d5-8f15-4067-b9ae-9b9640f8cdfa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399610-bv55t" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.449586 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f2b4785-aae5-4031-9e66-c3601ef67b6a-inventory\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-mbq9m\" (UID: \"5f2b4785-aae5-4031-9e66-c3601ef67b6a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mbq9m" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.450069 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f2b4785-aae5-4031-9e66-c3601ef67b6a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mbq9m\" (UID: \"5f2b4785-aae5-4031-9e66-c3601ef67b6a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mbq9m" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.450355 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c8408d5-8f15-4067-b9ae-9b9640f8cdfa-secret-volume\") pod \"collect-profiles-29399610-bv55t\" (UID: \"1c8408d5-8f15-4067-b9ae-9b9640f8cdfa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399610-bv55t" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.459220 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b47ww\" (UniqueName: \"kubernetes.io/projected/1c8408d5-8f15-4067-b9ae-9b9640f8cdfa-kube-api-access-b47ww\") pod \"collect-profiles-29399610-bv55t\" (UID: \"1c8408d5-8f15-4067-b9ae-9b9640f8cdfa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399610-bv55t" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.462057 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsqck\" (UniqueName: \"kubernetes.io/projected/5f2b4785-aae5-4031-9e66-c3601ef67b6a-kube-api-access-vsqck\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mbq9m\" (UID: \"5f2b4785-aae5-4031-9e66-c3601ef67b6a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mbq9m" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.469521 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399610-bv55t" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.474182 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mbq9m" Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.928655 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399610-bv55t"] Nov 24 09:30:00 crc kubenswrapper[4563]: I1124 09:30:00.960704 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-mbq9m"] Nov 24 09:30:00 crc kubenswrapper[4563]: W1124 09:30:00.967410 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f2b4785_aae5_4031_9e66_c3601ef67b6a.slice/crio-ec2104e81fe16f77fa8c663d5269050f83717b0ffc3b45d99ce2cbd106ba9a53 WatchSource:0}: Error finding container ec2104e81fe16f77fa8c663d5269050f83717b0ffc3b45d99ce2cbd106ba9a53: Status 404 returned error can't find the container with id ec2104e81fe16f77fa8c663d5269050f83717b0ffc3b45d99ce2cbd106ba9a53 Nov 24 09:30:01 crc kubenswrapper[4563]: I1124 09:30:01.112933 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399610-bv55t" event={"ID":"1c8408d5-8f15-4067-b9ae-9b9640f8cdfa","Type":"ContainerStarted","Data":"af502d59b32f073881edb225f743754250d0daed527161f271e263a95066c993"} Nov 24 09:30:01 crc kubenswrapper[4563]: I1124 09:30:01.113395 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399610-bv55t" event={"ID":"1c8408d5-8f15-4067-b9ae-9b9640f8cdfa","Type":"ContainerStarted","Data":"b764609091417a333ebe17c7bcef841bf5142ecb71108eef5a2c6b9df30b1ded"} Nov 24 09:30:01 crc kubenswrapper[4563]: I1124 09:30:01.115058 4563 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mbq9m" event={"ID":"5f2b4785-aae5-4031-9e66-c3601ef67b6a","Type":"ContainerStarted","Data":"ec2104e81fe16f77fa8c663d5269050f83717b0ffc3b45d99ce2cbd106ba9a53"} Nov 24 09:30:01 crc kubenswrapper[4563]: I1124 09:30:01.131658 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29399610-bv55t" podStartSLOduration=1.131627024 podStartE2EDuration="1.131627024s" podCreationTimestamp="2025-11-24 09:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:30:01.12597727 +0000 UTC m=+1578.384954717" watchObservedRunningTime="2025-11-24 09:30:01.131627024 +0000 UTC m=+1578.390604472" Nov 24 09:30:02 crc kubenswrapper[4563]: I1124 09:30:02.124561 4563 generic.go:334] "Generic (PLEG): container finished" podID="1c8408d5-8f15-4067-b9ae-9b9640f8cdfa" containerID="af502d59b32f073881edb225f743754250d0daed527161f271e263a95066c993" exitCode=0 Nov 24 09:30:02 crc kubenswrapper[4563]: I1124 09:30:02.125349 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399610-bv55t" event={"ID":"1c8408d5-8f15-4067-b9ae-9b9640f8cdfa","Type":"ContainerDied","Data":"af502d59b32f073881edb225f743754250d0daed527161f271e263a95066c993"} Nov 24 09:30:02 crc kubenswrapper[4563]: I1124 09:30:02.126853 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mbq9m" event={"ID":"5f2b4785-aae5-4031-9e66-c3601ef67b6a","Type":"ContainerStarted","Data":"1aed6ac7387a512926befd467fd695de6132927a5e259c6e5f2274a1d21d5127"} Nov 24 09:30:02 crc kubenswrapper[4563]: I1124 09:30:02.167412 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mbq9m" 
podStartSLOduration=1.7054070989999999 podStartE2EDuration="2.167383378s" podCreationTimestamp="2025-11-24 09:30:00 +0000 UTC" firstStartedPulling="2025-11-24 09:30:00.97220239 +0000 UTC m=+1578.231179827" lastFinishedPulling="2025-11-24 09:30:01.434178668 +0000 UTC m=+1578.693156106" observedRunningTime="2025-11-24 09:30:02.155237051 +0000 UTC m=+1579.414214498" watchObservedRunningTime="2025-11-24 09:30:02.167383378 +0000 UTC m=+1579.426360824" Nov 24 09:30:03 crc kubenswrapper[4563]: I1124 09:30:03.378875 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399610-bv55t" Nov 24 09:30:03 crc kubenswrapper[4563]: I1124 09:30:03.409622 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b47ww\" (UniqueName: \"kubernetes.io/projected/1c8408d5-8f15-4067-b9ae-9b9640f8cdfa-kube-api-access-b47ww\") pod \"1c8408d5-8f15-4067-b9ae-9b9640f8cdfa\" (UID: \"1c8408d5-8f15-4067-b9ae-9b9640f8cdfa\") " Nov 24 09:30:03 crc kubenswrapper[4563]: I1124 09:30:03.409736 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c8408d5-8f15-4067-b9ae-9b9640f8cdfa-config-volume\") pod \"1c8408d5-8f15-4067-b9ae-9b9640f8cdfa\" (UID: \"1c8408d5-8f15-4067-b9ae-9b9640f8cdfa\") " Nov 24 09:30:03 crc kubenswrapper[4563]: I1124 09:30:03.409854 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c8408d5-8f15-4067-b9ae-9b9640f8cdfa-secret-volume\") pod \"1c8408d5-8f15-4067-b9ae-9b9640f8cdfa\" (UID: \"1c8408d5-8f15-4067-b9ae-9b9640f8cdfa\") " Nov 24 09:30:03 crc kubenswrapper[4563]: I1124 09:30:03.410753 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c8408d5-8f15-4067-b9ae-9b9640f8cdfa-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"1c8408d5-8f15-4067-b9ae-9b9640f8cdfa" (UID: "1c8408d5-8f15-4067-b9ae-9b9640f8cdfa"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:30:03 crc kubenswrapper[4563]: I1124 09:30:03.416079 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c8408d5-8f15-4067-b9ae-9b9640f8cdfa-kube-api-access-b47ww" (OuterVolumeSpecName: "kube-api-access-b47ww") pod "1c8408d5-8f15-4067-b9ae-9b9640f8cdfa" (UID: "1c8408d5-8f15-4067-b9ae-9b9640f8cdfa"). InnerVolumeSpecName "kube-api-access-b47ww". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:30:03 crc kubenswrapper[4563]: I1124 09:30:03.416699 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c8408d5-8f15-4067-b9ae-9b9640f8cdfa-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1c8408d5-8f15-4067-b9ae-9b9640f8cdfa" (UID: "1c8408d5-8f15-4067-b9ae-9b9640f8cdfa"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:30:03 crc kubenswrapper[4563]: I1124 09:30:03.511865 4563 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c8408d5-8f15-4067-b9ae-9b9640f8cdfa-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 24 09:30:03 crc kubenswrapper[4563]: I1124 09:30:03.511904 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b47ww\" (UniqueName: \"kubernetes.io/projected/1c8408d5-8f15-4067-b9ae-9b9640f8cdfa-kube-api-access-b47ww\") on node \"crc\" DevicePath \"\"" Nov 24 09:30:03 crc kubenswrapper[4563]: I1124 09:30:03.511915 4563 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c8408d5-8f15-4067-b9ae-9b9640f8cdfa-config-volume\") on node \"crc\" DevicePath \"\"" Nov 24 09:30:04 crc kubenswrapper[4563]: I1124 09:30:04.149976 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399610-bv55t" event={"ID":"1c8408d5-8f15-4067-b9ae-9b9640f8cdfa","Type":"ContainerDied","Data":"b764609091417a333ebe17c7bcef841bf5142ecb71108eef5a2c6b9df30b1ded"} Nov 24 09:30:04 crc kubenswrapper[4563]: I1124 09:30:04.150020 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b764609091417a333ebe17c7bcef841bf5142ecb71108eef5a2c6b9df30b1ded" Nov 24 09:30:04 crc kubenswrapper[4563]: I1124 09:30:04.149992 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399610-bv55t" Nov 24 09:30:06 crc kubenswrapper[4563]: I1124 09:30:06.037613 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rgwhs"] Nov 24 09:30:06 crc kubenswrapper[4563]: I1124 09:30:06.044431 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rgwhs"] Nov 24 09:30:07 crc kubenswrapper[4563]: I1124 09:30:07.065921 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c326972-9b5f-4f8c-b71d-6811d65b31e1" path="/var/lib/kubelet/pods/8c326972-9b5f-4f8c-b71d-6811d65b31e1/volumes" Nov 24 09:30:08 crc kubenswrapper[4563]: I1124 09:30:08.195080 4563 generic.go:334] "Generic (PLEG): container finished" podID="5f2b4785-aae5-4031-9e66-c3601ef67b6a" containerID="1aed6ac7387a512926befd467fd695de6132927a5e259c6e5f2274a1d21d5127" exitCode=0 Nov 24 09:30:08 crc kubenswrapper[4563]: I1124 09:30:08.195174 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mbq9m" event={"ID":"5f2b4785-aae5-4031-9e66-c3601ef67b6a","Type":"ContainerDied","Data":"1aed6ac7387a512926befd467fd695de6132927a5e259c6e5f2274a1d21d5127"} Nov 24 09:30:09 crc kubenswrapper[4563]: I1124 09:30:09.537809 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mbq9m" Nov 24 09:30:09 crc kubenswrapper[4563]: I1124 09:30:09.642869 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsqck\" (UniqueName: \"kubernetes.io/projected/5f2b4785-aae5-4031-9e66-c3601ef67b6a-kube-api-access-vsqck\") pod \"5f2b4785-aae5-4031-9e66-c3601ef67b6a\" (UID: \"5f2b4785-aae5-4031-9e66-c3601ef67b6a\") " Nov 24 09:30:09 crc kubenswrapper[4563]: I1124 09:30:09.643360 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f2b4785-aae5-4031-9e66-c3601ef67b6a-inventory\") pod \"5f2b4785-aae5-4031-9e66-c3601ef67b6a\" (UID: \"5f2b4785-aae5-4031-9e66-c3601ef67b6a\") " Nov 24 09:30:09 crc kubenswrapper[4563]: I1124 09:30:09.643557 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f2b4785-aae5-4031-9e66-c3601ef67b6a-ssh-key\") pod \"5f2b4785-aae5-4031-9e66-c3601ef67b6a\" (UID: \"5f2b4785-aae5-4031-9e66-c3601ef67b6a\") " Nov 24 09:30:09 crc kubenswrapper[4563]: I1124 09:30:09.649776 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f2b4785-aae5-4031-9e66-c3601ef67b6a-kube-api-access-vsqck" (OuterVolumeSpecName: "kube-api-access-vsqck") pod "5f2b4785-aae5-4031-9e66-c3601ef67b6a" (UID: "5f2b4785-aae5-4031-9e66-c3601ef67b6a"). InnerVolumeSpecName "kube-api-access-vsqck". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:30:09 crc kubenswrapper[4563]: I1124 09:30:09.666999 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f2b4785-aae5-4031-9e66-c3601ef67b6a-inventory" (OuterVolumeSpecName: "inventory") pod "5f2b4785-aae5-4031-9e66-c3601ef67b6a" (UID: "5f2b4785-aae5-4031-9e66-c3601ef67b6a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:30:09 crc kubenswrapper[4563]: I1124 09:30:09.668441 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f2b4785-aae5-4031-9e66-c3601ef67b6a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5f2b4785-aae5-4031-9e66-c3601ef67b6a" (UID: "5f2b4785-aae5-4031-9e66-c3601ef67b6a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:30:09 crc kubenswrapper[4563]: I1124 09:30:09.746757 4563 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f2b4785-aae5-4031-9e66-c3601ef67b6a-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 09:30:09 crc kubenswrapper[4563]: I1124 09:30:09.746782 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsqck\" (UniqueName: \"kubernetes.io/projected/5f2b4785-aae5-4031-9e66-c3601ef67b6a-kube-api-access-vsqck\") on node \"crc\" DevicePath \"\"" Nov 24 09:30:09 crc kubenswrapper[4563]: I1124 09:30:09.746794 4563 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f2b4785-aae5-4031-9e66-c3601ef67b6a-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 09:30:10 crc kubenswrapper[4563]: I1124 09:30:10.214149 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mbq9m" event={"ID":"5f2b4785-aae5-4031-9e66-c3601ef67b6a","Type":"ContainerDied","Data":"ec2104e81fe16f77fa8c663d5269050f83717b0ffc3b45d99ce2cbd106ba9a53"} Nov 24 09:30:10 crc kubenswrapper[4563]: I1124 09:30:10.214193 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec2104e81fe16f77fa8c663d5269050f83717b0ffc3b45d99ce2cbd106ba9a53" Nov 24 09:30:10 crc kubenswrapper[4563]: I1124 09:30:10.214279 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mbq9m" Nov 24 09:30:10 crc kubenswrapper[4563]: I1124 09:30:10.268088 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw"] Nov 24 09:30:10 crc kubenswrapper[4563]: E1124 09:30:10.268460 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c8408d5-8f15-4067-b9ae-9b9640f8cdfa" containerName="collect-profiles" Nov 24 09:30:10 crc kubenswrapper[4563]: I1124 09:30:10.268476 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c8408d5-8f15-4067-b9ae-9b9640f8cdfa" containerName="collect-profiles" Nov 24 09:30:10 crc kubenswrapper[4563]: E1124 09:30:10.268496 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f2b4785-aae5-4031-9e66-c3601ef67b6a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 24 09:30:10 crc kubenswrapper[4563]: I1124 09:30:10.268503 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f2b4785-aae5-4031-9e66-c3601ef67b6a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 24 09:30:10 crc kubenswrapper[4563]: I1124 09:30:10.268713 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c8408d5-8f15-4067-b9ae-9b9640f8cdfa" containerName="collect-profiles" Nov 24 09:30:10 crc kubenswrapper[4563]: I1124 09:30:10.268743 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f2b4785-aae5-4031-9e66-c3601ef67b6a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 24 09:30:10 crc kubenswrapper[4563]: I1124 09:30:10.269323 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw" Nov 24 09:30:10 crc kubenswrapper[4563]: I1124 09:30:10.271413 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 24 09:30:10 crc kubenswrapper[4563]: I1124 09:30:10.271576 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jd9jk" Nov 24 09:30:10 crc kubenswrapper[4563]: I1124 09:30:10.272009 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 09:30:10 crc kubenswrapper[4563]: I1124 09:30:10.272177 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 24 09:30:10 crc kubenswrapper[4563]: I1124 09:30:10.276681 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw"] Nov 24 09:30:10 crc kubenswrapper[4563]: I1124 09:30:10.360653 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw\" (UID: \"bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw" Nov 24 09:30:10 crc kubenswrapper[4563]: I1124 09:30:10.360786 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw\" (UID: \"bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw" Nov 24 09:30:10 crc kubenswrapper[4563]: I1124 09:30:10.361064 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxnz5\" (UniqueName: \"kubernetes.io/projected/bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0-kube-api-access-bxnz5\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw\" (UID: \"bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw" Nov 24 09:30:10 crc kubenswrapper[4563]: I1124 09:30:10.462459 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxnz5\" (UniqueName: \"kubernetes.io/projected/bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0-kube-api-access-bxnz5\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw\" (UID: \"bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw" Nov 24 09:30:10 crc kubenswrapper[4563]: I1124 09:30:10.462564 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw\" (UID: \"bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw" Nov 24 09:30:10 crc kubenswrapper[4563]: I1124 09:30:10.462612 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw\" (UID: \"bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw" Nov 24 09:30:10 crc kubenswrapper[4563]: I1124 09:30:10.467203 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw\" (UID: 
\"bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw" Nov 24 09:30:10 crc kubenswrapper[4563]: I1124 09:30:10.467491 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw\" (UID: \"bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw" Nov 24 09:30:10 crc kubenswrapper[4563]: I1124 09:30:10.476442 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxnz5\" (UniqueName: \"kubernetes.io/projected/bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0-kube-api-access-bxnz5\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw\" (UID: \"bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw" Nov 24 09:30:10 crc kubenswrapper[4563]: I1124 09:30:10.581360 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw" Nov 24 09:30:11 crc kubenswrapper[4563]: I1124 09:30:11.033564 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw"] Nov 24 09:30:11 crc kubenswrapper[4563]: I1124 09:30:11.225431 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw" event={"ID":"bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0","Type":"ContainerStarted","Data":"338e698302112ebd0db1096506612371dc8342e49e05e7629d89614654acdb37"} Nov 24 09:30:12 crc kubenswrapper[4563]: I1124 09:30:12.233632 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw" event={"ID":"bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0","Type":"ContainerStarted","Data":"8472da902d88441a11895d0946e47a7b964904433bebcce41ee4eb302f345930"} Nov 24 09:30:12 crc kubenswrapper[4563]: I1124 09:30:12.255295 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw" podStartSLOduration=1.7504425449999998 podStartE2EDuration="2.255281045s" podCreationTimestamp="2025-11-24 09:30:10 +0000 UTC" firstStartedPulling="2025-11-24 09:30:11.038675133 +0000 UTC m=+1588.297652581" lastFinishedPulling="2025-11-24 09:30:11.543513643 +0000 UTC m=+1588.802491081" observedRunningTime="2025-11-24 09:30:12.245678575 +0000 UTC m=+1589.504656012" watchObservedRunningTime="2025-11-24 09:30:12.255281045 +0000 UTC m=+1589.514258491" Nov 24 09:30:19 crc kubenswrapper[4563]: I1124 09:30:19.295151 4563 generic.go:334] "Generic (PLEG): container finished" podID="bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0" containerID="8472da902d88441a11895d0946e47a7b964904433bebcce41ee4eb302f345930" exitCode=0 Nov 24 09:30:19 crc kubenswrapper[4563]: I1124 09:30:19.295181 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw" event={"ID":"bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0","Type":"ContainerDied","Data":"8472da902d88441a11895d0946e47a7b964904433bebcce41ee4eb302f345930"} Nov 24 09:30:20 crc kubenswrapper[4563]: I1124 09:30:20.626381 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw" Nov 24 09:30:20 crc kubenswrapper[4563]: I1124 09:30:20.748128 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0-ssh-key\") pod \"bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0\" (UID: \"bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0\") " Nov 24 09:30:20 crc kubenswrapper[4563]: I1124 09:30:20.748195 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxnz5\" (UniqueName: \"kubernetes.io/projected/bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0-kube-api-access-bxnz5\") pod \"bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0\" (UID: \"bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0\") " Nov 24 09:30:20 crc kubenswrapper[4563]: I1124 09:30:20.748332 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0-inventory\") pod \"bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0\" (UID: \"bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0\") " Nov 24 09:30:20 crc kubenswrapper[4563]: I1124 09:30:20.752654 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0-kube-api-access-bxnz5" (OuterVolumeSpecName: "kube-api-access-bxnz5") pod "bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0" (UID: "bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0"). InnerVolumeSpecName "kube-api-access-bxnz5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:30:20 crc kubenswrapper[4563]: I1124 09:30:20.767941 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0" (UID: "bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:30:20 crc kubenswrapper[4563]: I1124 09:30:20.769702 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0-inventory" (OuterVolumeSpecName: "inventory") pod "bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0" (UID: "bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:30:20 crc kubenswrapper[4563]: I1124 09:30:20.850633 4563 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 09:30:20 crc kubenswrapper[4563]: I1124 09:30:20.850961 4563 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 09:30:20 crc kubenswrapper[4563]: I1124 09:30:20.850972 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxnz5\" (UniqueName: \"kubernetes.io/projected/bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0-kube-api-access-bxnz5\") on node \"crc\" DevicePath \"\"" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.312825 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw" 
event={"ID":"bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0","Type":"ContainerDied","Data":"338e698302112ebd0db1096506612371dc8342e49e05e7629d89614654acdb37"} Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.312887 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="338e698302112ebd0db1096506612371dc8342e49e05e7629d89614654acdb37" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.312890 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.368181 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr"] Nov 24 09:30:21 crc kubenswrapper[4563]: E1124 09:30:21.368683 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.368701 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.368888 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.369672 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.372086 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.372437 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.372817 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.372906 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.372917 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.373110 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jd9jk" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.373687 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.374088 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.384014 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr"] Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.566067 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.566116 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.566153 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.566174 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.566282 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.566407 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.566488 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.566603 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.566735 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xrjb\" (UniqueName: 
\"kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-kube-api-access-7xrjb\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.566767 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.566861 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.566904 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.566924 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.567008 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.668357 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.668402 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.668424 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.668442 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.668468 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.668493 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.668518 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.668549 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.668585 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xrjb\" (UniqueName: \"kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-kube-api-access-7xrjb\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.668603 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.668629 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.668675 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.668691 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.668716 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.674074 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.674226 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.674362 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.674650 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.674781 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.674944 4563 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.675119 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.675436 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.675908 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.675918 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-openstack-edpm-ipam-libvirt-default-certs-0\") 
pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.675924 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.676453 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.676546 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.682875 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xrjb\" (UniqueName: \"kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-kube-api-access-7xrjb\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr\" (UID: 
\"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:21 crc kubenswrapper[4563]: I1124 09:30:21.982885 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:22 crc kubenswrapper[4563]: I1124 09:30:22.425282 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr"] Nov 24 09:30:22 crc kubenswrapper[4563]: W1124 09:30:22.431465 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa741fe2_400c_479c_bfb3_0d5273b064e2.slice/crio-108add6df860766345c425aed4b50fa3e025464357bd495ca953a82b899e1e71 WatchSource:0}: Error finding container 108add6df860766345c425aed4b50fa3e025464357bd495ca953a82b899e1e71: Status 404 returned error can't find the container with id 108add6df860766345c425aed4b50fa3e025464357bd495ca953a82b899e1e71 Nov 24 09:30:23 crc kubenswrapper[4563]: I1124 09:30:23.332728 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" event={"ID":"aa741fe2-400c-479c-bfb3-0d5273b064e2","Type":"ContainerStarted","Data":"2e618864151e324772f8a9cd21feed81ce7fe6ac8d6898d960fe71d18a8be0b3"} Nov 24 09:30:23 crc kubenswrapper[4563]: I1124 09:30:23.333049 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" event={"ID":"aa741fe2-400c-479c-bfb3-0d5273b064e2","Type":"ContainerStarted","Data":"108add6df860766345c425aed4b50fa3e025464357bd495ca953a82b899e1e71"} Nov 24 09:30:23 crc kubenswrapper[4563]: I1124 09:30:23.353887 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" podStartSLOduration=1.867339954 
podStartE2EDuration="2.353872691s" podCreationTimestamp="2025-11-24 09:30:21 +0000 UTC" firstStartedPulling="2025-11-24 09:30:22.433565501 +0000 UTC m=+1599.692542949" lastFinishedPulling="2025-11-24 09:30:22.920098239 +0000 UTC m=+1600.179075686" observedRunningTime="2025-11-24 09:30:23.345274636 +0000 UTC m=+1600.604252084" watchObservedRunningTime="2025-11-24 09:30:23.353872691 +0000 UTC m=+1600.612850139" Nov 24 09:30:25 crc kubenswrapper[4563]: I1124 09:30:25.021498 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-c6q5k"] Nov 24 09:30:25 crc kubenswrapper[4563]: I1124 09:30:25.028301 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-c6q5k"] Nov 24 09:30:25 crc kubenswrapper[4563]: I1124 09:30:25.062765 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355" path="/var/lib/kubelet/pods/fbebc6ce-2c30-4daa-8e0f-87f6ef9a5355/volumes" Nov 24 09:30:27 crc kubenswrapper[4563]: I1124 09:30:27.021082 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-fhd9d"] Nov 24 09:30:27 crc kubenswrapper[4563]: I1124 09:30:27.029325 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-fhd9d"] Nov 24 09:30:27 crc kubenswrapper[4563]: I1124 09:30:27.063750 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ad9a449-c587-419b-8f09-3fa89ed6a90b" path="/var/lib/kubelet/pods/0ad9a449-c587-419b-8f09-3fa89ed6a90b/volumes" Nov 24 09:30:38 crc kubenswrapper[4563]: I1124 09:30:38.987862 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:30:38 crc kubenswrapper[4563]: I1124 09:30:38.988296 
4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:30:46 crc kubenswrapper[4563]: I1124 09:30:46.857081 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7k4wv"] Nov 24 09:30:46 crc kubenswrapper[4563]: I1124 09:30:46.859612 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7k4wv" Nov 24 09:30:46 crc kubenswrapper[4563]: I1124 09:30:46.867584 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7k4wv"] Nov 24 09:30:46 crc kubenswrapper[4563]: I1124 09:30:46.967363 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ad52be4-4946-4ee4-9227-11cbefad5fda-catalog-content\") pod \"certified-operators-7k4wv\" (UID: \"2ad52be4-4946-4ee4-9227-11cbefad5fda\") " pod="openshift-marketplace/certified-operators-7k4wv" Nov 24 09:30:46 crc kubenswrapper[4563]: I1124 09:30:46.967758 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqxkr\" (UniqueName: \"kubernetes.io/projected/2ad52be4-4946-4ee4-9227-11cbefad5fda-kube-api-access-pqxkr\") pod \"certified-operators-7k4wv\" (UID: \"2ad52be4-4946-4ee4-9227-11cbefad5fda\") " pod="openshift-marketplace/certified-operators-7k4wv" Nov 24 09:30:46 crc kubenswrapper[4563]: I1124 09:30:46.967862 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ad52be4-4946-4ee4-9227-11cbefad5fda-utilities\") pod 
\"certified-operators-7k4wv\" (UID: \"2ad52be4-4946-4ee4-9227-11cbefad5fda\") " pod="openshift-marketplace/certified-operators-7k4wv" Nov 24 09:30:47 crc kubenswrapper[4563]: I1124 09:30:47.052233 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7pmhr"] Nov 24 09:30:47 crc kubenswrapper[4563]: I1124 09:30:47.053919 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7pmhr" Nov 24 09:30:47 crc kubenswrapper[4563]: I1124 09:30:47.069016 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ad52be4-4946-4ee4-9227-11cbefad5fda-utilities\") pod \"certified-operators-7k4wv\" (UID: \"2ad52be4-4946-4ee4-9227-11cbefad5fda\") " pod="openshift-marketplace/certified-operators-7k4wv" Nov 24 09:30:47 crc kubenswrapper[4563]: I1124 09:30:47.069091 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58fwc\" (UniqueName: \"kubernetes.io/projected/533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d-kube-api-access-58fwc\") pod \"community-operators-7pmhr\" (UID: \"533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d\") " pod="openshift-marketplace/community-operators-7pmhr" Nov 24 09:30:47 crc kubenswrapper[4563]: I1124 09:30:47.069147 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d-utilities\") pod \"community-operators-7pmhr\" (UID: \"533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d\") " pod="openshift-marketplace/community-operators-7pmhr" Nov 24 09:30:47 crc kubenswrapper[4563]: I1124 09:30:47.069194 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ad52be4-4946-4ee4-9227-11cbefad5fda-catalog-content\") pod 
\"certified-operators-7k4wv\" (UID: \"2ad52be4-4946-4ee4-9227-11cbefad5fda\") " pod="openshift-marketplace/certified-operators-7k4wv" Nov 24 09:30:47 crc kubenswrapper[4563]: I1124 09:30:47.069260 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7pmhr"] Nov 24 09:30:47 crc kubenswrapper[4563]: I1124 09:30:47.069269 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d-catalog-content\") pod \"community-operators-7pmhr\" (UID: \"533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d\") " pod="openshift-marketplace/community-operators-7pmhr" Nov 24 09:30:47 crc kubenswrapper[4563]: I1124 09:30:47.069354 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ad52be4-4946-4ee4-9227-11cbefad5fda-utilities\") pod \"certified-operators-7k4wv\" (UID: \"2ad52be4-4946-4ee4-9227-11cbefad5fda\") " pod="openshift-marketplace/certified-operators-7k4wv" Nov 24 09:30:47 crc kubenswrapper[4563]: I1124 09:30:47.069468 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqxkr\" (UniqueName: \"kubernetes.io/projected/2ad52be4-4946-4ee4-9227-11cbefad5fda-kube-api-access-pqxkr\") pod \"certified-operators-7k4wv\" (UID: \"2ad52be4-4946-4ee4-9227-11cbefad5fda\") " pod="openshift-marketplace/certified-operators-7k4wv" Nov 24 09:30:47 crc kubenswrapper[4563]: I1124 09:30:47.069719 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ad52be4-4946-4ee4-9227-11cbefad5fda-catalog-content\") pod \"certified-operators-7k4wv\" (UID: \"2ad52be4-4946-4ee4-9227-11cbefad5fda\") " pod="openshift-marketplace/certified-operators-7k4wv" Nov 24 09:30:47 crc kubenswrapper[4563]: I1124 09:30:47.124408 4563 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-pqxkr\" (UniqueName: \"kubernetes.io/projected/2ad52be4-4946-4ee4-9227-11cbefad5fda-kube-api-access-pqxkr\") pod \"certified-operators-7k4wv\" (UID: \"2ad52be4-4946-4ee4-9227-11cbefad5fda\") " pod="openshift-marketplace/certified-operators-7k4wv" Nov 24 09:30:47 crc kubenswrapper[4563]: I1124 09:30:47.172435 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d-catalog-content\") pod \"community-operators-7pmhr\" (UID: \"533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d\") " pod="openshift-marketplace/community-operators-7pmhr" Nov 24 09:30:47 crc kubenswrapper[4563]: I1124 09:30:47.172622 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58fwc\" (UniqueName: \"kubernetes.io/projected/533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d-kube-api-access-58fwc\") pod \"community-operators-7pmhr\" (UID: \"533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d\") " pod="openshift-marketplace/community-operators-7pmhr" Nov 24 09:30:47 crc kubenswrapper[4563]: I1124 09:30:47.172690 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d-utilities\") pod \"community-operators-7pmhr\" (UID: \"533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d\") " pod="openshift-marketplace/community-operators-7pmhr" Nov 24 09:30:47 crc kubenswrapper[4563]: I1124 09:30:47.173121 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d-utilities\") pod \"community-operators-7pmhr\" (UID: \"533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d\") " pod="openshift-marketplace/community-operators-7pmhr" Nov 24 09:30:47 crc kubenswrapper[4563]: I1124 09:30:47.173210 4563 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d-catalog-content\") pod \"community-operators-7pmhr\" (UID: \"533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d\") " pod="openshift-marketplace/community-operators-7pmhr" Nov 24 09:30:47 crc kubenswrapper[4563]: I1124 09:30:47.173660 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7k4wv" Nov 24 09:30:47 crc kubenswrapper[4563]: I1124 09:30:47.201309 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58fwc\" (UniqueName: \"kubernetes.io/projected/533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d-kube-api-access-58fwc\") pod \"community-operators-7pmhr\" (UID: \"533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d\") " pod="openshift-marketplace/community-operators-7pmhr" Nov 24 09:30:47 crc kubenswrapper[4563]: I1124 09:30:47.371696 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7pmhr" Nov 24 09:30:47 crc kubenswrapper[4563]: I1124 09:30:47.629956 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7k4wv"] Nov 24 09:30:47 crc kubenswrapper[4563]: I1124 09:30:47.903257 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7pmhr"] Nov 24 09:30:47 crc kubenswrapper[4563]: W1124 09:30:47.908349 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod533c3a59_9f5d_4ab0_8d5c_5e4880b30a2d.slice/crio-883798da485338122ae760c91b758184febf3b2b0dba24cfda8cb7874ed21e1c WatchSource:0}: Error finding container 883798da485338122ae760c91b758184febf3b2b0dba24cfda8cb7874ed21e1c: Status 404 returned error can't find the container with id 883798da485338122ae760c91b758184febf3b2b0dba24cfda8cb7874ed21e1c Nov 24 09:30:48 crc kubenswrapper[4563]: I1124 09:30:48.510978 
4563 generic.go:334] "Generic (PLEG): container finished" podID="2ad52be4-4946-4ee4-9227-11cbefad5fda" containerID="3fdce3a2038db4a17e1b05ab2b71f316e8ffe492bad0f85c006d7b0db032fa06" exitCode=0 Nov 24 09:30:48 crc kubenswrapper[4563]: I1124 09:30:48.511028 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7k4wv" event={"ID":"2ad52be4-4946-4ee4-9227-11cbefad5fda","Type":"ContainerDied","Data":"3fdce3a2038db4a17e1b05ab2b71f316e8ffe492bad0f85c006d7b0db032fa06"} Nov 24 09:30:48 crc kubenswrapper[4563]: I1124 09:30:48.511263 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7k4wv" event={"ID":"2ad52be4-4946-4ee4-9227-11cbefad5fda","Type":"ContainerStarted","Data":"2b34b7a2571fd5507162e3c505f862d4da4f5202fb5b258161252e5bccc567b6"} Nov 24 09:30:48 crc kubenswrapper[4563]: I1124 09:30:48.513179 4563 generic.go:334] "Generic (PLEG): container finished" podID="533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d" containerID="a219d1bb373f467bf5aa233965f584d7f97675e5151d25702acd4ed3a475bce7" exitCode=0 Nov 24 09:30:48 crc kubenswrapper[4563]: I1124 09:30:48.513207 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pmhr" event={"ID":"533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d","Type":"ContainerDied","Data":"a219d1bb373f467bf5aa233965f584d7f97675e5151d25702acd4ed3a475bce7"} Nov 24 09:30:48 crc kubenswrapper[4563]: I1124 09:30:48.513224 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pmhr" event={"ID":"533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d","Type":"ContainerStarted","Data":"883798da485338122ae760c91b758184febf3b2b0dba24cfda8cb7874ed21e1c"} Nov 24 09:30:49 crc kubenswrapper[4563]: I1124 09:30:49.525420 4563 generic.go:334] "Generic (PLEG): container finished" podID="2ad52be4-4946-4ee4-9227-11cbefad5fda" containerID="fbdf33aa5a498b18d198caa3b05b45d2d53c7b53a9c3cb1d0ebba5c09853e6d5" exitCode=0 Nov 
24 09:30:49 crc kubenswrapper[4563]: I1124 09:30:49.525514 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7k4wv" event={"ID":"2ad52be4-4946-4ee4-9227-11cbefad5fda","Type":"ContainerDied","Data":"fbdf33aa5a498b18d198caa3b05b45d2d53c7b53a9c3cb1d0ebba5c09853e6d5"} Nov 24 09:30:49 crc kubenswrapper[4563]: I1124 09:30:49.529543 4563 generic.go:334] "Generic (PLEG): container finished" podID="533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d" containerID="b9abd529278d2e7d67ee2207114eae7dcf2136262cc96dd4d53d197d1572a792" exitCode=0 Nov 24 09:30:49 crc kubenswrapper[4563]: I1124 09:30:49.529578 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pmhr" event={"ID":"533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d","Type":"ContainerDied","Data":"b9abd529278d2e7d67ee2207114eae7dcf2136262cc96dd4d53d197d1572a792"} Nov 24 09:30:50 crc kubenswrapper[4563]: I1124 09:30:50.539036 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pmhr" event={"ID":"533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d","Type":"ContainerStarted","Data":"72d1fb9e57eb3f6ea700a21ac251cfc93ef71c82979b9230990891e0faa230d2"} Nov 24 09:30:50 crc kubenswrapper[4563]: I1124 09:30:50.542004 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7k4wv" event={"ID":"2ad52be4-4946-4ee4-9227-11cbefad5fda","Type":"ContainerStarted","Data":"7c41646088ab95f860c48af17b51f557e7dedc6afff96d4c31a32de5af1206c9"} Nov 24 09:30:50 crc kubenswrapper[4563]: I1124 09:30:50.543593 4563 generic.go:334] "Generic (PLEG): container finished" podID="aa741fe2-400c-479c-bfb3-0d5273b064e2" containerID="2e618864151e324772f8a9cd21feed81ce7fe6ac8d6898d960fe71d18a8be0b3" exitCode=0 Nov 24 09:30:50 crc kubenswrapper[4563]: I1124 09:30:50.543623 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" 
event={"ID":"aa741fe2-400c-479c-bfb3-0d5273b064e2","Type":"ContainerDied","Data":"2e618864151e324772f8a9cd21feed81ce7fe6ac8d6898d960fe71d18a8be0b3"} Nov 24 09:30:50 crc kubenswrapper[4563]: I1124 09:30:50.558279 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7pmhr" podStartSLOduration=1.928185128 podStartE2EDuration="3.558270738s" podCreationTimestamp="2025-11-24 09:30:47 +0000 UTC" firstStartedPulling="2025-11-24 09:30:48.514667969 +0000 UTC m=+1625.773645416" lastFinishedPulling="2025-11-24 09:30:50.144753579 +0000 UTC m=+1627.403731026" observedRunningTime="2025-11-24 09:30:50.553081613 +0000 UTC m=+1627.812059059" watchObservedRunningTime="2025-11-24 09:30:50.558270738 +0000 UTC m=+1627.817248186" Nov 24 09:30:50 crc kubenswrapper[4563]: I1124 09:30:50.571524 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7k4wv" podStartSLOduration=3.090083087 podStartE2EDuration="4.571516408s" podCreationTimestamp="2025-11-24 09:30:46 +0000 UTC" firstStartedPulling="2025-11-24 09:30:48.512895225 +0000 UTC m=+1625.771872673" lastFinishedPulling="2025-11-24 09:30:49.994328547 +0000 UTC m=+1627.253305994" observedRunningTime="2025-11-24 09:30:50.566164956 +0000 UTC m=+1627.825142413" watchObservedRunningTime="2025-11-24 09:30:50.571516408 +0000 UTC m=+1627.830493855" Nov 24 09:30:51 crc kubenswrapper[4563]: I1124 09:30:51.862064 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:51 crc kubenswrapper[4563]: I1124 09:30:51.961043 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"aa741fe2-400c-479c-bfb3-0d5273b064e2\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " Nov 24 09:30:51 crc kubenswrapper[4563]: I1124 09:30:51.961080 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"aa741fe2-400c-479c-bfb3-0d5273b064e2\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " Nov 24 09:30:51 crc kubenswrapper[4563]: I1124 09:30:51.961120 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-libvirt-combined-ca-bundle\") pod \"aa741fe2-400c-479c-bfb3-0d5273b064e2\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " Nov 24 09:30:51 crc kubenswrapper[4563]: I1124 09:30:51.961163 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-ssh-key\") pod \"aa741fe2-400c-479c-bfb3-0d5273b064e2\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " Nov 24 09:30:51 crc kubenswrapper[4563]: I1124 09:30:51.961217 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-openstack-edpm-ipam-libvirt-default-certs-0\") pod 
\"aa741fe2-400c-479c-bfb3-0d5273b064e2\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " Nov 24 09:30:51 crc kubenswrapper[4563]: I1124 09:30:51.961246 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xrjb\" (UniqueName: \"kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-kube-api-access-7xrjb\") pod \"aa741fe2-400c-479c-bfb3-0d5273b064e2\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " Nov 24 09:30:51 crc kubenswrapper[4563]: I1124 09:30:51.961268 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-ovn-combined-ca-bundle\") pod \"aa741fe2-400c-479c-bfb3-0d5273b064e2\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " Nov 24 09:30:51 crc kubenswrapper[4563]: I1124 09:30:51.961325 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-openstack-edpm-ipam-ovn-default-certs-0\") pod \"aa741fe2-400c-479c-bfb3-0d5273b064e2\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " Nov 24 09:30:51 crc kubenswrapper[4563]: I1124 09:30:51.961347 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-inventory\") pod \"aa741fe2-400c-479c-bfb3-0d5273b064e2\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " Nov 24 09:30:51 crc kubenswrapper[4563]: I1124 09:30:51.961366 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-telemetry-combined-ca-bundle\") pod \"aa741fe2-400c-479c-bfb3-0d5273b064e2\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " Nov 24 09:30:51 crc kubenswrapper[4563]: I1124 
09:30:51.961383 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-neutron-metadata-combined-ca-bundle\") pod \"aa741fe2-400c-479c-bfb3-0d5273b064e2\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " Nov 24 09:30:51 crc kubenswrapper[4563]: I1124 09:30:51.961420 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-bootstrap-combined-ca-bundle\") pod \"aa741fe2-400c-479c-bfb3-0d5273b064e2\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " Nov 24 09:30:51 crc kubenswrapper[4563]: I1124 09:30:51.961917 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-nova-combined-ca-bundle\") pod \"aa741fe2-400c-479c-bfb3-0d5273b064e2\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " Nov 24 09:30:51 crc kubenswrapper[4563]: I1124 09:30:51.961968 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-repo-setup-combined-ca-bundle\") pod \"aa741fe2-400c-479c-bfb3-0d5273b064e2\" (UID: \"aa741fe2-400c-479c-bfb3-0d5273b064e2\") " Nov 24 09:30:51 crc kubenswrapper[4563]: I1124 09:30:51.966956 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-kube-api-access-7xrjb" (OuterVolumeSpecName: "kube-api-access-7xrjb") pod "aa741fe2-400c-479c-bfb3-0d5273b064e2" (UID: "aa741fe2-400c-479c-bfb3-0d5273b064e2"). InnerVolumeSpecName "kube-api-access-7xrjb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:30:51 crc kubenswrapper[4563]: I1124 09:30:51.967599 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "aa741fe2-400c-479c-bfb3-0d5273b064e2" (UID: "aa741fe2-400c-479c-bfb3-0d5273b064e2"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:30:51 crc kubenswrapper[4563]: I1124 09:30:51.967651 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "aa741fe2-400c-479c-bfb3-0d5273b064e2" (UID: "aa741fe2-400c-479c-bfb3-0d5273b064e2"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:30:51 crc kubenswrapper[4563]: I1124 09:30:51.970071 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "aa741fe2-400c-479c-bfb3-0d5273b064e2" (UID: "aa741fe2-400c-479c-bfb3-0d5273b064e2"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:30:51 crc kubenswrapper[4563]: I1124 09:30:51.970096 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "aa741fe2-400c-479c-bfb3-0d5273b064e2" (UID: "aa741fe2-400c-479c-bfb3-0d5273b064e2"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:30:51 crc kubenswrapper[4563]: I1124 09:30:51.970168 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "aa741fe2-400c-479c-bfb3-0d5273b064e2" (UID: "aa741fe2-400c-479c-bfb3-0d5273b064e2"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:30:51 crc kubenswrapper[4563]: I1124 09:30:51.970204 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "aa741fe2-400c-479c-bfb3-0d5273b064e2" (UID: "aa741fe2-400c-479c-bfb3-0d5273b064e2"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:30:51 crc kubenswrapper[4563]: I1124 09:30:51.970282 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "aa741fe2-400c-479c-bfb3-0d5273b064e2" (UID: "aa741fe2-400c-479c-bfb3-0d5273b064e2"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:30:51 crc kubenswrapper[4563]: I1124 09:30:51.970462 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "aa741fe2-400c-479c-bfb3-0d5273b064e2" (UID: "aa741fe2-400c-479c-bfb3-0d5273b064e2"). 
InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:30:51 crc kubenswrapper[4563]: I1124 09:30:51.970485 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "aa741fe2-400c-479c-bfb3-0d5273b064e2" (UID: "aa741fe2-400c-479c-bfb3-0d5273b064e2"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:30:51 crc kubenswrapper[4563]: I1124 09:30:51.971038 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "aa741fe2-400c-479c-bfb3-0d5273b064e2" (UID: "aa741fe2-400c-479c-bfb3-0d5273b064e2"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:30:51 crc kubenswrapper[4563]: I1124 09:30:51.971829 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "aa741fe2-400c-479c-bfb3-0d5273b064e2" (UID: "aa741fe2-400c-479c-bfb3-0d5273b064e2"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:30:51 crc kubenswrapper[4563]: I1124 09:30:51.988381 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "aa741fe2-400c-479c-bfb3-0d5273b064e2" (UID: "aa741fe2-400c-479c-bfb3-0d5273b064e2"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:30:51 crc kubenswrapper[4563]: I1124 09:30:51.988551 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-inventory" (OuterVolumeSpecName: "inventory") pod "aa741fe2-400c-479c-bfb3-0d5273b064e2" (UID: "aa741fe2-400c-479c-bfb3-0d5273b064e2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.064485 4563 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.064517 4563 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.064530 4563 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.064540 4563 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.064551 4563 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" 
DevicePath \"\"" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.064561 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xrjb\" (UniqueName: \"kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-kube-api-access-7xrjb\") on node \"crc\" DevicePath \"\"" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.064569 4563 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.064578 4563 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/aa741fe2-400c-479c-bfb3-0d5273b064e2-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.064590 4563 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.064599 4563 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.064609 4563 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.064618 4563 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.064626 4563 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.064634 4563 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa741fe2-400c-479c-bfb3-0d5273b064e2-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.280913 4563 scope.go:117] "RemoveContainer" containerID="6d437b26233b7501cdb08b51f1fac6eba877d7312f963513539d4a7684268d44" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.314791 4563 scope.go:117] "RemoveContainer" containerID="0be0346bf4becd1ff51770cad5e42c8beca3aae6fea47dd4defa408bcb9dad93" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.354661 4563 scope.go:117] "RemoveContainer" containerID="5a9ce07a645a3ee4fed06c306271eb75773a5e5411acca9077abbe25ed62eb05" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.561583 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" event={"ID":"aa741fe2-400c-479c-bfb3-0d5273b064e2","Type":"ContainerDied","Data":"108add6df860766345c425aed4b50fa3e025464357bd495ca953a82b899e1e71"} Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.561626 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="108add6df860766345c425aed4b50fa3e025464357bd495ca953a82b899e1e71" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.561674 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.672729 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-tt4mm"] Nov 24 09:30:52 crc kubenswrapper[4563]: E1124 09:30:52.673277 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa741fe2-400c-479c-bfb3-0d5273b064e2" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.673296 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa741fe2-400c-479c-bfb3-0d5273b064e2" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.673542 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa741fe2-400c-479c-bfb3-0d5273b064e2" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.674319 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tt4mm" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.677931 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.677942 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.678240 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.678283 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.678390 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jd9jk" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.682313 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-tt4mm"] Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.780735 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tt4mm\" (UID: \"5ab3a15a-af4c-43a3-9d3a-1515e2c8228b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tt4mm" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.780880 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tt4mm\" (UID: 
\"5ab3a15a-af4c-43a3-9d3a-1515e2c8228b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tt4mm" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.780976 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tt4mm\" (UID: \"5ab3a15a-af4c-43a3-9d3a-1515e2c8228b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tt4mm" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.781093 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw8jt\" (UniqueName: \"kubernetes.io/projected/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-kube-api-access-dw8jt\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tt4mm\" (UID: \"5ab3a15a-af4c-43a3-9d3a-1515e2c8228b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tt4mm" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.781189 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tt4mm\" (UID: \"5ab3a15a-af4c-43a3-9d3a-1515e2c8228b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tt4mm" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.881883 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tt4mm\" (UID: \"5ab3a15a-af4c-43a3-9d3a-1515e2c8228b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tt4mm" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.881966 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tt4mm\" (UID: \"5ab3a15a-af4c-43a3-9d3a-1515e2c8228b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tt4mm" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.882007 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tt4mm\" (UID: \"5ab3a15a-af4c-43a3-9d3a-1515e2c8228b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tt4mm" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.882053 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw8jt\" (UniqueName: \"kubernetes.io/projected/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-kube-api-access-dw8jt\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tt4mm\" (UID: \"5ab3a15a-af4c-43a3-9d3a-1515e2c8228b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tt4mm" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.882081 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tt4mm\" (UID: \"5ab3a15a-af4c-43a3-9d3a-1515e2c8228b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tt4mm" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.883504 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tt4mm\" (UID: \"5ab3a15a-af4c-43a3-9d3a-1515e2c8228b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tt4mm" Nov 24 09:30:52 crc 
kubenswrapper[4563]: I1124 09:30:52.886103 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tt4mm\" (UID: \"5ab3a15a-af4c-43a3-9d3a-1515e2c8228b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tt4mm" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.886256 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tt4mm\" (UID: \"5ab3a15a-af4c-43a3-9d3a-1515e2c8228b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tt4mm" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.886919 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tt4mm\" (UID: \"5ab3a15a-af4c-43a3-9d3a-1515e2c8228b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tt4mm" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.897231 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw8jt\" (UniqueName: \"kubernetes.io/projected/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-kube-api-access-dw8jt\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tt4mm\" (UID: \"5ab3a15a-af4c-43a3-9d3a-1515e2c8228b\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tt4mm" Nov 24 09:30:52 crc kubenswrapper[4563]: I1124 09:30:52.999944 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tt4mm" Nov 24 09:30:53 crc kubenswrapper[4563]: I1124 09:30:53.430141 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-tt4mm"] Nov 24 09:30:53 crc kubenswrapper[4563]: W1124 09:30:53.430913 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ab3a15a_af4c_43a3_9d3a_1515e2c8228b.slice/crio-035e1c445eaafe46a76ee474d0aa084abf1bfef9309b389d8a2b2ed69b195a24 WatchSource:0}: Error finding container 035e1c445eaafe46a76ee474d0aa084abf1bfef9309b389d8a2b2ed69b195a24: Status 404 returned error can't find the container with id 035e1c445eaafe46a76ee474d0aa084abf1bfef9309b389d8a2b2ed69b195a24 Nov 24 09:30:53 crc kubenswrapper[4563]: I1124 09:30:53.570430 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tt4mm" event={"ID":"5ab3a15a-af4c-43a3-9d3a-1515e2c8228b","Type":"ContainerStarted","Data":"035e1c445eaafe46a76ee474d0aa084abf1bfef9309b389d8a2b2ed69b195a24"} Nov 24 09:30:54 crc kubenswrapper[4563]: I1124 09:30:54.592300 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tt4mm" event={"ID":"5ab3a15a-af4c-43a3-9d3a-1515e2c8228b","Type":"ContainerStarted","Data":"f3b810cc3d0298c727825b55ff076d46332b0f334eb967c0e70977fade575124"} Nov 24 09:30:57 crc kubenswrapper[4563]: I1124 09:30:57.174448 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7k4wv" Nov 24 09:30:57 crc kubenswrapper[4563]: I1124 09:30:57.174758 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7k4wv" Nov 24 09:30:57 crc kubenswrapper[4563]: I1124 09:30:57.210778 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-7k4wv" Nov 24 09:30:57 crc kubenswrapper[4563]: I1124 09:30:57.229358 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tt4mm" podStartSLOduration=4.766873318 podStartE2EDuration="5.229343988s" podCreationTimestamp="2025-11-24 09:30:52 +0000 UTC" firstStartedPulling="2025-11-24 09:30:53.433625846 +0000 UTC m=+1630.692603293" lastFinishedPulling="2025-11-24 09:30:53.896096515 +0000 UTC m=+1631.155073963" observedRunningTime="2025-11-24 09:30:54.612941602 +0000 UTC m=+1631.871919049" watchObservedRunningTime="2025-11-24 09:30:57.229343988 +0000 UTC m=+1634.488321436" Nov 24 09:30:57 crc kubenswrapper[4563]: I1124 09:30:57.371888 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7pmhr" Nov 24 09:30:57 crc kubenswrapper[4563]: I1124 09:30:57.371972 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7pmhr" Nov 24 09:30:57 crc kubenswrapper[4563]: I1124 09:30:57.406099 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7pmhr" Nov 24 09:30:57 crc kubenswrapper[4563]: I1124 09:30:57.647421 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7k4wv" Nov 24 09:30:57 crc kubenswrapper[4563]: I1124 09:30:57.648299 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7pmhr" Nov 24 09:30:58 crc kubenswrapper[4563]: I1124 09:30:58.243928 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7pmhr"] Nov 24 09:30:59 crc kubenswrapper[4563]: I1124 09:30:59.626927 4563 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-7pmhr" podUID="533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d" containerName="registry-server" containerID="cri-o://72d1fb9e57eb3f6ea700a21ac251cfc93ef71c82979b9230990891e0faa230d2" gracePeriod=2 Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.000129 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7pmhr" Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.042706 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7k4wv"] Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.042908 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7k4wv" podUID="2ad52be4-4946-4ee4-9227-11cbefad5fda" containerName="registry-server" containerID="cri-o://7c41646088ab95f860c48af17b51f557e7dedc6afff96d4c31a32de5af1206c9" gracePeriod=2 Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.114120 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d-catalog-content\") pod \"533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d\" (UID: \"533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d\") " Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.114189 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58fwc\" (UniqueName: \"kubernetes.io/projected/533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d-kube-api-access-58fwc\") pod \"533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d\" (UID: \"533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d\") " Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.114229 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d-utilities\") pod \"533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d\" (UID: 
\"533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d\") " Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.115258 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d-utilities" (OuterVolumeSpecName: "utilities") pod "533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d" (UID: "533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.120943 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d-kube-api-access-58fwc" (OuterVolumeSpecName: "kube-api-access-58fwc") pod "533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d" (UID: "533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d"). InnerVolumeSpecName "kube-api-access-58fwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.216044 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58fwc\" (UniqueName: \"kubernetes.io/projected/533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d-kube-api-access-58fwc\") on node \"crc\" DevicePath \"\"" Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.216191 4563 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.440770 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d" (UID: "533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.524945 4563 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.637072 4563 generic.go:334] "Generic (PLEG): container finished" podID="2ad52be4-4946-4ee4-9227-11cbefad5fda" containerID="7c41646088ab95f860c48af17b51f557e7dedc6afff96d4c31a32de5af1206c9" exitCode=0 Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.637138 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7k4wv" event={"ID":"2ad52be4-4946-4ee4-9227-11cbefad5fda","Type":"ContainerDied","Data":"7c41646088ab95f860c48af17b51f557e7dedc6afff96d4c31a32de5af1206c9"} Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.640237 4563 generic.go:334] "Generic (PLEG): container finished" podID="533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d" containerID="72d1fb9e57eb3f6ea700a21ac251cfc93ef71c82979b9230990891e0faa230d2" exitCode=0 Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.640283 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pmhr" event={"ID":"533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d","Type":"ContainerDied","Data":"72d1fb9e57eb3f6ea700a21ac251cfc93ef71c82979b9230990891e0faa230d2"} Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.640317 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pmhr" event={"ID":"533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d","Type":"ContainerDied","Data":"883798da485338122ae760c91b758184febf3b2b0dba24cfda8cb7874ed21e1c"} Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.640327 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7pmhr" Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.640335 4563 scope.go:117] "RemoveContainer" containerID="72d1fb9e57eb3f6ea700a21ac251cfc93ef71c82979b9230990891e0faa230d2" Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.663041 4563 scope.go:117] "RemoveContainer" containerID="b9abd529278d2e7d67ee2207114eae7dcf2136262cc96dd4d53d197d1572a792" Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.665536 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7pmhr"] Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.670866 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7pmhr"] Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.703935 4563 scope.go:117] "RemoveContainer" containerID="a219d1bb373f467bf5aa233965f584d7f97675e5151d25702acd4ed3a475bce7" Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.716401 4563 scope.go:117] "RemoveContainer" containerID="72d1fb9e57eb3f6ea700a21ac251cfc93ef71c82979b9230990891e0faa230d2" Nov 24 09:31:00 crc kubenswrapper[4563]: E1124 09:31:00.716706 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72d1fb9e57eb3f6ea700a21ac251cfc93ef71c82979b9230990891e0faa230d2\": container with ID starting with 72d1fb9e57eb3f6ea700a21ac251cfc93ef71c82979b9230990891e0faa230d2 not found: ID does not exist" containerID="72d1fb9e57eb3f6ea700a21ac251cfc93ef71c82979b9230990891e0faa230d2" Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.716738 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72d1fb9e57eb3f6ea700a21ac251cfc93ef71c82979b9230990891e0faa230d2"} err="failed to get container status \"72d1fb9e57eb3f6ea700a21ac251cfc93ef71c82979b9230990891e0faa230d2\": rpc error: code = NotFound desc = could not find 
container \"72d1fb9e57eb3f6ea700a21ac251cfc93ef71c82979b9230990891e0faa230d2\": container with ID starting with 72d1fb9e57eb3f6ea700a21ac251cfc93ef71c82979b9230990891e0faa230d2 not found: ID does not exist" Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.716758 4563 scope.go:117] "RemoveContainer" containerID="b9abd529278d2e7d67ee2207114eae7dcf2136262cc96dd4d53d197d1572a792" Nov 24 09:31:00 crc kubenswrapper[4563]: E1124 09:31:00.716995 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9abd529278d2e7d67ee2207114eae7dcf2136262cc96dd4d53d197d1572a792\": container with ID starting with b9abd529278d2e7d67ee2207114eae7dcf2136262cc96dd4d53d197d1572a792 not found: ID does not exist" containerID="b9abd529278d2e7d67ee2207114eae7dcf2136262cc96dd4d53d197d1572a792" Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.717021 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9abd529278d2e7d67ee2207114eae7dcf2136262cc96dd4d53d197d1572a792"} err="failed to get container status \"b9abd529278d2e7d67ee2207114eae7dcf2136262cc96dd4d53d197d1572a792\": rpc error: code = NotFound desc = could not find container \"b9abd529278d2e7d67ee2207114eae7dcf2136262cc96dd4d53d197d1572a792\": container with ID starting with b9abd529278d2e7d67ee2207114eae7dcf2136262cc96dd4d53d197d1572a792 not found: ID does not exist" Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.717037 4563 scope.go:117] "RemoveContainer" containerID="a219d1bb373f467bf5aa233965f584d7f97675e5151d25702acd4ed3a475bce7" Nov 24 09:31:00 crc kubenswrapper[4563]: E1124 09:31:00.717247 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a219d1bb373f467bf5aa233965f584d7f97675e5151d25702acd4ed3a475bce7\": container with ID starting with a219d1bb373f467bf5aa233965f584d7f97675e5151d25702acd4ed3a475bce7 not found: ID does 
not exist" containerID="a219d1bb373f467bf5aa233965f584d7f97675e5151d25702acd4ed3a475bce7" Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.717266 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a219d1bb373f467bf5aa233965f584d7f97675e5151d25702acd4ed3a475bce7"} err="failed to get container status \"a219d1bb373f467bf5aa233965f584d7f97675e5151d25702acd4ed3a475bce7\": rpc error: code = NotFound desc = could not find container \"a219d1bb373f467bf5aa233965f584d7f97675e5151d25702acd4ed3a475bce7\": container with ID starting with a219d1bb373f467bf5aa233965f584d7f97675e5151d25702acd4ed3a475bce7 not found: ID does not exist" Nov 24 09:31:00 crc kubenswrapper[4563]: I1124 09:31:00.930614 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7k4wv" Nov 24 09:31:01 crc kubenswrapper[4563]: I1124 09:31:01.032741 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ad52be4-4946-4ee4-9227-11cbefad5fda-catalog-content\") pod \"2ad52be4-4946-4ee4-9227-11cbefad5fda\" (UID: \"2ad52be4-4946-4ee4-9227-11cbefad5fda\") " Nov 24 09:31:01 crc kubenswrapper[4563]: I1124 09:31:01.032863 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ad52be4-4946-4ee4-9227-11cbefad5fda-utilities\") pod \"2ad52be4-4946-4ee4-9227-11cbefad5fda\" (UID: \"2ad52be4-4946-4ee4-9227-11cbefad5fda\") " Nov 24 09:31:01 crc kubenswrapper[4563]: I1124 09:31:01.032951 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqxkr\" (UniqueName: \"kubernetes.io/projected/2ad52be4-4946-4ee4-9227-11cbefad5fda-kube-api-access-pqxkr\") pod \"2ad52be4-4946-4ee4-9227-11cbefad5fda\" (UID: \"2ad52be4-4946-4ee4-9227-11cbefad5fda\") " Nov 24 09:31:01 crc kubenswrapper[4563]: I1124 
09:31:01.033662 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ad52be4-4946-4ee4-9227-11cbefad5fda-utilities" (OuterVolumeSpecName: "utilities") pod "2ad52be4-4946-4ee4-9227-11cbefad5fda" (UID: "2ad52be4-4946-4ee4-9227-11cbefad5fda"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:31:01 crc kubenswrapper[4563]: I1124 09:31:01.034656 4563 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ad52be4-4946-4ee4-9227-11cbefad5fda-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 09:31:01 crc kubenswrapper[4563]: I1124 09:31:01.038117 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ad52be4-4946-4ee4-9227-11cbefad5fda-kube-api-access-pqxkr" (OuterVolumeSpecName: "kube-api-access-pqxkr") pod "2ad52be4-4946-4ee4-9227-11cbefad5fda" (UID: "2ad52be4-4946-4ee4-9227-11cbefad5fda"). InnerVolumeSpecName "kube-api-access-pqxkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:31:01 crc kubenswrapper[4563]: I1124 09:31:01.064846 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d" path="/var/lib/kubelet/pods/533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d/volumes" Nov 24 09:31:01 crc kubenswrapper[4563]: I1124 09:31:01.066972 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ad52be4-4946-4ee4-9227-11cbefad5fda-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ad52be4-4946-4ee4-9227-11cbefad5fda" (UID: "2ad52be4-4946-4ee4-9227-11cbefad5fda"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:31:01 crc kubenswrapper[4563]: I1124 09:31:01.136611 4563 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ad52be4-4946-4ee4-9227-11cbefad5fda-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 09:31:01 crc kubenswrapper[4563]: I1124 09:31:01.136654 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqxkr\" (UniqueName: \"kubernetes.io/projected/2ad52be4-4946-4ee4-9227-11cbefad5fda-kube-api-access-pqxkr\") on node \"crc\" DevicePath \"\"" Nov 24 09:31:01 crc kubenswrapper[4563]: I1124 09:31:01.652249 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7k4wv" event={"ID":"2ad52be4-4946-4ee4-9227-11cbefad5fda","Type":"ContainerDied","Data":"2b34b7a2571fd5507162e3c505f862d4da4f5202fb5b258161252e5bccc567b6"} Nov 24 09:31:01 crc kubenswrapper[4563]: I1124 09:31:01.652290 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7k4wv" Nov 24 09:31:01 crc kubenswrapper[4563]: I1124 09:31:01.652544 4563 scope.go:117] "RemoveContainer" containerID="7c41646088ab95f860c48af17b51f557e7dedc6afff96d4c31a32de5af1206c9" Nov 24 09:31:01 crc kubenswrapper[4563]: I1124 09:31:01.674085 4563 scope.go:117] "RemoveContainer" containerID="fbdf33aa5a498b18d198caa3b05b45d2d53c7b53a9c3cb1d0ebba5c09853e6d5" Nov 24 09:31:01 crc kubenswrapper[4563]: I1124 09:31:01.676102 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7k4wv"] Nov 24 09:31:01 crc kubenswrapper[4563]: I1124 09:31:01.682109 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7k4wv"] Nov 24 09:31:01 crc kubenswrapper[4563]: I1124 09:31:01.697467 4563 scope.go:117] "RemoveContainer" containerID="3fdce3a2038db4a17e1b05ab2b71f316e8ffe492bad0f85c006d7b0db032fa06" Nov 24 09:31:03 crc kubenswrapper[4563]: I1124 09:31:03.065874 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ad52be4-4946-4ee4-9227-11cbefad5fda" path="/var/lib/kubelet/pods/2ad52be4-4946-4ee4-9227-11cbefad5fda/volumes" Nov 24 09:31:08 crc kubenswrapper[4563]: I1124 09:31:08.987501 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:31:08 crc kubenswrapper[4563]: I1124 09:31:08.987930 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:31:12 crc kubenswrapper[4563]: 
I1124 09:31:12.026435 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-94qxv"] Nov 24 09:31:12 crc kubenswrapper[4563]: I1124 09:31:12.031344 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-94qxv"] Nov 24 09:31:13 crc kubenswrapper[4563]: I1124 09:31:13.062205 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f81c63b6-56cb-41c4-b2ad-8eb1d99419d8" path="/var/lib/kubelet/pods/f81c63b6-56cb-41c4-b2ad-8eb1d99419d8/volumes" Nov 24 09:31:35 crc kubenswrapper[4563]: I1124 09:31:35.899624 4563 generic.go:334] "Generic (PLEG): container finished" podID="5ab3a15a-af4c-43a3-9d3a-1515e2c8228b" containerID="f3b810cc3d0298c727825b55ff076d46332b0f334eb967c0e70977fade575124" exitCode=0 Nov 24 09:31:35 crc kubenswrapper[4563]: I1124 09:31:35.899679 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tt4mm" event={"ID":"5ab3a15a-af4c-43a3-9d3a-1515e2c8228b","Type":"ContainerDied","Data":"f3b810cc3d0298c727825b55ff076d46332b0f334eb967c0e70977fade575124"} Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.252712 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tt4mm" Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.444343 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-inventory\") pod \"5ab3a15a-af4c-43a3-9d3a-1515e2c8228b\" (UID: \"5ab3a15a-af4c-43a3-9d3a-1515e2c8228b\") " Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.444384 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-ssh-key\") pod \"5ab3a15a-af4c-43a3-9d3a-1515e2c8228b\" (UID: \"5ab3a15a-af4c-43a3-9d3a-1515e2c8228b\") " Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.444407 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-ovncontroller-config-0\") pod \"5ab3a15a-af4c-43a3-9d3a-1515e2c8228b\" (UID: \"5ab3a15a-af4c-43a3-9d3a-1515e2c8228b\") " Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.444456 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-ovn-combined-ca-bundle\") pod \"5ab3a15a-af4c-43a3-9d3a-1515e2c8228b\" (UID: \"5ab3a15a-af4c-43a3-9d3a-1515e2c8228b\") " Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.444493 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw8jt\" (UniqueName: \"kubernetes.io/projected/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-kube-api-access-dw8jt\") pod \"5ab3a15a-af4c-43a3-9d3a-1515e2c8228b\" (UID: \"5ab3a15a-af4c-43a3-9d3a-1515e2c8228b\") " Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.451086 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "5ab3a15a-af4c-43a3-9d3a-1515e2c8228b" (UID: "5ab3a15a-af4c-43a3-9d3a-1515e2c8228b"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.451394 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-kube-api-access-dw8jt" (OuterVolumeSpecName: "kube-api-access-dw8jt") pod "5ab3a15a-af4c-43a3-9d3a-1515e2c8228b" (UID: "5ab3a15a-af4c-43a3-9d3a-1515e2c8228b"). InnerVolumeSpecName "kube-api-access-dw8jt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.469406 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "5ab3a15a-af4c-43a3-9d3a-1515e2c8228b" (UID: "5ab3a15a-af4c-43a3-9d3a-1515e2c8228b"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.472177 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-inventory" (OuterVolumeSpecName: "inventory") pod "5ab3a15a-af4c-43a3-9d3a-1515e2c8228b" (UID: "5ab3a15a-af4c-43a3-9d3a-1515e2c8228b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.472276 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5ab3a15a-af4c-43a3-9d3a-1515e2c8228b" (UID: "5ab3a15a-af4c-43a3-9d3a-1515e2c8228b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.548965 4563 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.549006 4563 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.549021 4563 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.549036 4563 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.549047 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw8jt\" (UniqueName: \"kubernetes.io/projected/5ab3a15a-af4c-43a3-9d3a-1515e2c8228b-kube-api-access-dw8jt\") on node \"crc\" DevicePath \"\"" Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.926230 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tt4mm" event={"ID":"5ab3a15a-af4c-43a3-9d3a-1515e2c8228b","Type":"ContainerDied","Data":"035e1c445eaafe46a76ee474d0aa084abf1bfef9309b389d8a2b2ed69b195a24"} Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.926270 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="035e1c445eaafe46a76ee474d0aa084abf1bfef9309b389d8a2b2ed69b195a24" Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.926364 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tt4mm" Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.997226 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2"] Nov 24 09:31:37 crc kubenswrapper[4563]: E1124 09:31:37.998244 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d" containerName="extract-content" Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.998312 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d" containerName="extract-content" Nov 24 09:31:37 crc kubenswrapper[4563]: E1124 09:31:37.998462 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ab3a15a-af4c-43a3-9d3a-1515e2c8228b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.998511 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ab3a15a-af4c-43a3-9d3a-1515e2c8228b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 24 09:31:37 crc kubenswrapper[4563]: E1124 09:31:37.998588 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ad52be4-4946-4ee4-9227-11cbefad5fda" containerName="extract-utilities" Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.998650 4563 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2ad52be4-4946-4ee4-9227-11cbefad5fda" containerName="extract-utilities" Nov 24 09:31:37 crc kubenswrapper[4563]: E1124 09:31:37.998717 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ad52be4-4946-4ee4-9227-11cbefad5fda" containerName="registry-server" Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.998774 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ad52be4-4946-4ee4-9227-11cbefad5fda" containerName="registry-server" Nov 24 09:31:37 crc kubenswrapper[4563]: E1124 09:31:37.998831 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d" containerName="extract-utilities" Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.998880 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d" containerName="extract-utilities" Nov 24 09:31:37 crc kubenswrapper[4563]: E1124 09:31:37.998931 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ad52be4-4946-4ee4-9227-11cbefad5fda" containerName="extract-content" Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.998972 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ad52be4-4946-4ee4-9227-11cbefad5fda" containerName="extract-content" Nov 24 09:31:37 crc kubenswrapper[4563]: E1124 09:31:37.999019 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d" containerName="registry-server" Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.999060 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d" containerName="registry-server" Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.999312 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ad52be4-4946-4ee4-9227-11cbefad5fda" containerName="registry-server" Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.999384 4563 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5ab3a15a-af4c-43a3-9d3a-1515e2c8228b" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 24 09:31:37 crc kubenswrapper[4563]: I1124 09:31:37.999444 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="533c3a59-9f5d-4ab0-8d5c-5e4880b30a2d" containerName="registry-server" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.000310 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.003198 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.003393 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jd9jk" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.003496 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.003812 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.004253 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.004292 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.007187 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2"] Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.161680 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2\" (UID: \"5c13fe46-9855-4291-b685-df5de9abafa7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.161736 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbw67\" (UniqueName: \"kubernetes.io/projected/5c13fe46-9855-4291-b685-df5de9abafa7-kube-api-access-sbw67\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2\" (UID: \"5c13fe46-9855-4291-b685-df5de9abafa7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.161906 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2\" (UID: \"5c13fe46-9855-4291-b685-df5de9abafa7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.161941 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2\" (UID: \"5c13fe46-9855-4291-b685-df5de9abafa7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.161982 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2\" (UID: \"5c13fe46-9855-4291-b685-df5de9abafa7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.162008 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2\" (UID: \"5c13fe46-9855-4291-b685-df5de9abafa7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.265358 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2\" (UID: \"5c13fe46-9855-4291-b685-df5de9abafa7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.265483 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2\" (UID: \"5c13fe46-9855-4291-b685-df5de9abafa7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.265555 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2\" (UID: \"5c13fe46-9855-4291-b685-df5de9abafa7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.265597 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2\" (UID: \"5c13fe46-9855-4291-b685-df5de9abafa7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.265700 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2\" (UID: \"5c13fe46-9855-4291-b685-df5de9abafa7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.265731 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbw67\" (UniqueName: \"kubernetes.io/projected/5c13fe46-9855-4291-b685-df5de9abafa7-kube-api-access-sbw67\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2\" (UID: \"5c13fe46-9855-4291-b685-df5de9abafa7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.269111 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-neutron-metadata-combined-ca-bundle\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2\" (UID: \"5c13fe46-9855-4291-b685-df5de9abafa7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.270121 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2\" (UID: \"5c13fe46-9855-4291-b685-df5de9abafa7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.270440 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2\" (UID: \"5c13fe46-9855-4291-b685-df5de9abafa7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.270845 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2\" (UID: \"5c13fe46-9855-4291-b685-df5de9abafa7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.271882 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2\" (UID: \"5c13fe46-9855-4291-b685-df5de9abafa7\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.279554 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbw67\" (UniqueName: \"kubernetes.io/projected/5c13fe46-9855-4291-b685-df5de9abafa7-kube-api-access-sbw67\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2\" (UID: \"5c13fe46-9855-4291-b685-df5de9abafa7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.314377 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.780092 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2"] Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.934926 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2" event={"ID":"5c13fe46-9855-4291-b685-df5de9abafa7","Type":"ContainerStarted","Data":"e66b88eeac600c3229489fb21d7fa2af53e2c99e28431dc7974c5c5748654fb3"} Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.987484 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.987564 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.987659 4563 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.988342 4563 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512"} pod="openshift-machine-config-operator/machine-config-daemon-stlxr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 09:31:38 crc kubenswrapper[4563]: I1124 09:31:38.988421 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" containerID="cri-o://bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512" gracePeriod=600 Nov 24 09:31:39 crc kubenswrapper[4563]: E1124 09:31:39.105758 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:31:39 crc kubenswrapper[4563]: I1124 09:31:39.943840 4563 generic.go:334] "Generic (PLEG): container finished" podID="3b2bfe55-8989-49b3-bb61-e28189447627" containerID="bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512" exitCode=0 Nov 24 09:31:39 crc kubenswrapper[4563]: I1124 09:31:39.943903 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-stlxr" event={"ID":"3b2bfe55-8989-49b3-bb61-e28189447627","Type":"ContainerDied","Data":"bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512"} Nov 24 09:31:39 crc kubenswrapper[4563]: I1124 09:31:39.945603 4563 scope.go:117] "RemoveContainer" containerID="4d8f48825147068e682924024fb98e71a696f2055c921253ed4d8afbad01ed41" Nov 24 09:31:39 crc kubenswrapper[4563]: I1124 09:31:39.945905 4563 scope.go:117] "RemoveContainer" containerID="bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512" Nov 24 09:31:39 crc kubenswrapper[4563]: E1124 09:31:39.946236 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:31:39 crc kubenswrapper[4563]: I1124 09:31:39.946977 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2" event={"ID":"5c13fe46-9855-4291-b685-df5de9abafa7","Type":"ContainerStarted","Data":"8f64d716f8c73e14d76723b9d9d3f0bdfe492610bf3267b3f56d7cd417d85669"} Nov 24 09:31:39 crc kubenswrapper[4563]: I1124 09:31:39.976831 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2" podStartSLOduration=2.457655419 podStartE2EDuration="2.976813153s" podCreationTimestamp="2025-11-24 09:31:37 +0000 UTC" firstStartedPulling="2025-11-24 09:31:38.785739112 +0000 UTC m=+1676.044716559" lastFinishedPulling="2025-11-24 09:31:39.304896846 +0000 UTC m=+1676.563874293" observedRunningTime="2025-11-24 09:31:39.970878982 +0000 UTC m=+1677.229856429" 
watchObservedRunningTime="2025-11-24 09:31:39.976813153 +0000 UTC m=+1677.235790600" Nov 24 09:31:51 crc kubenswrapper[4563]: I1124 09:31:51.055036 4563 scope.go:117] "RemoveContainer" containerID="bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512" Nov 24 09:31:51 crc kubenswrapper[4563]: E1124 09:31:51.057373 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:31:52 crc kubenswrapper[4563]: I1124 09:31:52.440461 4563 scope.go:117] "RemoveContainer" containerID="a4aa16d5c3909ca3c0970da3ca2011671b929e433a1a535f46766005bcb45570" Nov 24 09:32:04 crc kubenswrapper[4563]: I1124 09:32:04.055462 4563 scope.go:117] "RemoveContainer" containerID="bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512" Nov 24 09:32:04 crc kubenswrapper[4563]: E1124 09:32:04.056273 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:32:13 crc kubenswrapper[4563]: I1124 09:32:13.234731 4563 generic.go:334] "Generic (PLEG): container finished" podID="5c13fe46-9855-4291-b685-df5de9abafa7" containerID="8f64d716f8c73e14d76723b9d9d3f0bdfe492610bf3267b3f56d7cd417d85669" exitCode=0 Nov 24 09:32:13 crc kubenswrapper[4563]: I1124 09:32:13.234831 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2" event={"ID":"5c13fe46-9855-4291-b685-df5de9abafa7","Type":"ContainerDied","Data":"8f64d716f8c73e14d76723b9d9d3f0bdfe492610bf3267b3f56d7cd417d85669"} Nov 24 09:32:14 crc kubenswrapper[4563]: I1124 09:32:14.574056 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2" Nov 24 09:32:14 crc kubenswrapper[4563]: I1124 09:32:14.684862 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbw67\" (UniqueName: \"kubernetes.io/projected/5c13fe46-9855-4291-b685-df5de9abafa7-kube-api-access-sbw67\") pod \"5c13fe46-9855-4291-b685-df5de9abafa7\" (UID: \"5c13fe46-9855-4291-b685-df5de9abafa7\") " Nov 24 09:32:14 crc kubenswrapper[4563]: I1124 09:32:14.684928 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-nova-metadata-neutron-config-0\") pod \"5c13fe46-9855-4291-b685-df5de9abafa7\" (UID: \"5c13fe46-9855-4291-b685-df5de9abafa7\") " Nov 24 09:32:14 crc kubenswrapper[4563]: I1124 09:32:14.684980 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"5c13fe46-9855-4291-b685-df5de9abafa7\" (UID: \"5c13fe46-9855-4291-b685-df5de9abafa7\") " Nov 24 09:32:14 crc kubenswrapper[4563]: I1124 09:32:14.685036 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-ssh-key\") pod \"5c13fe46-9855-4291-b685-df5de9abafa7\" (UID: \"5c13fe46-9855-4291-b685-df5de9abafa7\") " Nov 24 09:32:14 crc kubenswrapper[4563]: I1124 
09:32:14.685124 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-neutron-metadata-combined-ca-bundle\") pod \"5c13fe46-9855-4291-b685-df5de9abafa7\" (UID: \"5c13fe46-9855-4291-b685-df5de9abafa7\") " Nov 24 09:32:14 crc kubenswrapper[4563]: I1124 09:32:14.685176 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-inventory\") pod \"5c13fe46-9855-4291-b685-df5de9abafa7\" (UID: \"5c13fe46-9855-4291-b685-df5de9abafa7\") " Nov 24 09:32:14 crc kubenswrapper[4563]: I1124 09:32:14.691657 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c13fe46-9855-4291-b685-df5de9abafa7-kube-api-access-sbw67" (OuterVolumeSpecName: "kube-api-access-sbw67") pod "5c13fe46-9855-4291-b685-df5de9abafa7" (UID: "5c13fe46-9855-4291-b685-df5de9abafa7"). InnerVolumeSpecName "kube-api-access-sbw67". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:32:14 crc kubenswrapper[4563]: I1124 09:32:14.691847 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "5c13fe46-9855-4291-b685-df5de9abafa7" (UID: "5c13fe46-9855-4291-b685-df5de9abafa7"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:32:14 crc kubenswrapper[4563]: E1124 09:32:14.707739 4563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-inventory podName:5c13fe46-9855-4291-b685-df5de9abafa7 nodeName:}" failed. 
No retries permitted until 2025-11-24 09:32:15.207716348 +0000 UTC m=+1712.466693795 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-inventory") pod "5c13fe46-9855-4291-b685-df5de9abafa7" (UID: "5c13fe46-9855-4291-b685-df5de9abafa7") : error deleting /var/lib/kubelet/pods/5c13fe46-9855-4291-b685-df5de9abafa7/volume-subpaths: remove /var/lib/kubelet/pods/5c13fe46-9855-4291-b685-df5de9abafa7/volume-subpaths: no such file or directory Nov 24 09:32:14 crc kubenswrapper[4563]: I1124 09:32:14.709904 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5c13fe46-9855-4291-b685-df5de9abafa7" (UID: "5c13fe46-9855-4291-b685-df5de9abafa7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:32:14 crc kubenswrapper[4563]: I1124 09:32:14.710323 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "5c13fe46-9855-4291-b685-df5de9abafa7" (UID: "5c13fe46-9855-4291-b685-df5de9abafa7"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:32:14 crc kubenswrapper[4563]: I1124 09:32:14.710344 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "5c13fe46-9855-4291-b685-df5de9abafa7" (UID: "5c13fe46-9855-4291-b685-df5de9abafa7"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:32:14 crc kubenswrapper[4563]: I1124 09:32:14.786841 4563 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:32:14 crc kubenswrapper[4563]: I1124 09:32:14.786874 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbw67\" (UniqueName: \"kubernetes.io/projected/5c13fe46-9855-4291-b685-df5de9abafa7-kube-api-access-sbw67\") on node \"crc\" DevicePath \"\"" Nov 24 09:32:14 crc kubenswrapper[4563]: I1124 09:32:14.786885 4563 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 24 09:32:14 crc kubenswrapper[4563]: I1124 09:32:14.786895 4563 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 24 09:32:14 crc kubenswrapper[4563]: I1124 09:32:14.786908 4563 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 09:32:15 crc kubenswrapper[4563]: I1124 09:32:15.255676 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2" event={"ID":"5c13fe46-9855-4291-b685-df5de9abafa7","Type":"ContainerDied","Data":"e66b88eeac600c3229489fb21d7fa2af53e2c99e28431dc7974c5c5748654fb3"} Nov 24 09:32:15 crc kubenswrapper[4563]: I1124 09:32:15.255732 4563 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="e66b88eeac600c3229489fb21d7fa2af53e2c99e28431dc7974c5c5748654fb3" Nov 24 09:32:15 crc kubenswrapper[4563]: I1124 09:32:15.255762 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2" Nov 24 09:32:15 crc kubenswrapper[4563]: I1124 09:32:15.295370 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-inventory\") pod \"5c13fe46-9855-4291-b685-df5de9abafa7\" (UID: \"5c13fe46-9855-4291-b685-df5de9abafa7\") " Nov 24 09:32:15 crc kubenswrapper[4563]: I1124 09:32:15.298921 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-inventory" (OuterVolumeSpecName: "inventory") pod "5c13fe46-9855-4291-b685-df5de9abafa7" (UID: "5c13fe46-9855-4291-b685-df5de9abafa7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:32:15 crc kubenswrapper[4563]: I1124 09:32:15.321176 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z8q62"] Nov 24 09:32:15 crc kubenswrapper[4563]: E1124 09:32:15.321651 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c13fe46-9855-4291-b685-df5de9abafa7" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 24 09:32:15 crc kubenswrapper[4563]: I1124 09:32:15.321670 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c13fe46-9855-4291-b685-df5de9abafa7" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 24 09:32:15 crc kubenswrapper[4563]: I1124 09:32:15.321956 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c13fe46-9855-4291-b685-df5de9abafa7" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 24 09:32:15 crc kubenswrapper[4563]: I1124 09:32:15.322625 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z8q62" Nov 24 09:32:15 crc kubenswrapper[4563]: I1124 09:32:15.324528 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Nov 24 09:32:15 crc kubenswrapper[4563]: I1124 09:32:15.341614 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z8q62"] Nov 24 09:32:15 crc kubenswrapper[4563]: I1124 09:32:15.397289 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/96a07419-7337-47f5-89aa-233e06eec048-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z8q62\" (UID: \"96a07419-7337-47f5-89aa-233e06eec048\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z8q62" Nov 24 09:32:15 crc kubenswrapper[4563]: I1124 09:32:15.397548 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96a07419-7337-47f5-89aa-233e06eec048-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z8q62\" (UID: \"96a07419-7337-47f5-89aa-233e06eec048\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z8q62" Nov 24 09:32:15 crc kubenswrapper[4563]: I1124 09:32:15.397822 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96a07419-7337-47f5-89aa-233e06eec048-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z8q62\" (UID: \"96a07419-7337-47f5-89aa-233e06eec048\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z8q62" Nov 24 09:32:15 crc kubenswrapper[4563]: I1124 09:32:15.397943 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/96a07419-7337-47f5-89aa-233e06eec048-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z8q62\" (UID: \"96a07419-7337-47f5-89aa-233e06eec048\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z8q62" Nov 24 09:32:15 crc kubenswrapper[4563]: I1124 09:32:15.398013 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5z5s\" (UniqueName: \"kubernetes.io/projected/96a07419-7337-47f5-89aa-233e06eec048-kube-api-access-c5z5s\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z8q62\" (UID: \"96a07419-7337-47f5-89aa-233e06eec048\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z8q62" Nov 24 09:32:15 crc kubenswrapper[4563]: I1124 09:32:15.398151 4563 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c13fe46-9855-4291-b685-df5de9abafa7-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 09:32:15 crc kubenswrapper[4563]: I1124 09:32:15.499484 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96a07419-7337-47f5-89aa-233e06eec048-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z8q62\" (UID: \"96a07419-7337-47f5-89aa-233e06eec048\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z8q62" Nov 24 09:32:15 crc kubenswrapper[4563]: I1124 09:32:15.499553 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96a07419-7337-47f5-89aa-233e06eec048-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z8q62\" (UID: \"96a07419-7337-47f5-89aa-233e06eec048\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z8q62" Nov 24 09:32:15 crc kubenswrapper[4563]: I1124 09:32:15.499576 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-c5z5s\" (UniqueName: \"kubernetes.io/projected/96a07419-7337-47f5-89aa-233e06eec048-kube-api-access-c5z5s\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z8q62\" (UID: \"96a07419-7337-47f5-89aa-233e06eec048\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z8q62" Nov 24 09:32:15 crc kubenswrapper[4563]: I1124 09:32:15.499677 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/96a07419-7337-47f5-89aa-233e06eec048-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z8q62\" (UID: \"96a07419-7337-47f5-89aa-233e06eec048\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z8q62" Nov 24 09:32:15 crc kubenswrapper[4563]: I1124 09:32:15.499756 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96a07419-7337-47f5-89aa-233e06eec048-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z8q62\" (UID: \"96a07419-7337-47f5-89aa-233e06eec048\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z8q62" Nov 24 09:32:15 crc kubenswrapper[4563]: I1124 09:32:15.503235 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/96a07419-7337-47f5-89aa-233e06eec048-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z8q62\" (UID: \"96a07419-7337-47f5-89aa-233e06eec048\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z8q62" Nov 24 09:32:15 crc kubenswrapper[4563]: I1124 09:32:15.504449 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96a07419-7337-47f5-89aa-233e06eec048-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z8q62\" (UID: \"96a07419-7337-47f5-89aa-233e06eec048\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z8q62" Nov 24 09:32:15 
crc kubenswrapper[4563]: I1124 09:32:15.504589 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96a07419-7337-47f5-89aa-233e06eec048-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z8q62\" (UID: \"96a07419-7337-47f5-89aa-233e06eec048\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z8q62" Nov 24 09:32:15 crc kubenswrapper[4563]: I1124 09:32:15.504842 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96a07419-7337-47f5-89aa-233e06eec048-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z8q62\" (UID: \"96a07419-7337-47f5-89aa-233e06eec048\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z8q62" Nov 24 09:32:15 crc kubenswrapper[4563]: I1124 09:32:15.514564 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5z5s\" (UniqueName: \"kubernetes.io/projected/96a07419-7337-47f5-89aa-233e06eec048-kube-api-access-c5z5s\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z8q62\" (UID: \"96a07419-7337-47f5-89aa-233e06eec048\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z8q62" Nov 24 09:32:15 crc kubenswrapper[4563]: I1124 09:32:15.655567 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z8q62" Nov 24 09:32:16 crc kubenswrapper[4563]: I1124 09:32:16.108709 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z8q62"] Nov 24 09:32:16 crc kubenswrapper[4563]: W1124 09:32:16.111515 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96a07419_7337_47f5_89aa_233e06eec048.slice/crio-0ec19b9d047ef442de5ef6701091837d6b8d23a5faed27662dad21a62b899624 WatchSource:0}: Error finding container 0ec19b9d047ef442de5ef6701091837d6b8d23a5faed27662dad21a62b899624: Status 404 returned error can't find the container with id 0ec19b9d047ef442de5ef6701091837d6b8d23a5faed27662dad21a62b899624 Nov 24 09:32:16 crc kubenswrapper[4563]: I1124 09:32:16.262625 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z8q62" event={"ID":"96a07419-7337-47f5-89aa-233e06eec048","Type":"ContainerStarted","Data":"0ec19b9d047ef442de5ef6701091837d6b8d23a5faed27662dad21a62b899624"} Nov 24 09:32:17 crc kubenswrapper[4563]: I1124 09:32:17.270460 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z8q62" event={"ID":"96a07419-7337-47f5-89aa-233e06eec048","Type":"ContainerStarted","Data":"528847efd7bab73e5849834e5a94c2296ed6b207aafc288180a7a0911deb11d1"} Nov 24 09:32:17 crc kubenswrapper[4563]: I1124 09:32:17.285812 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z8q62" podStartSLOduration=1.747981303 podStartE2EDuration="2.285795479s" podCreationTimestamp="2025-11-24 09:32:15 +0000 UTC" firstStartedPulling="2025-11-24 09:32:16.11360827 +0000 UTC m=+1713.372585718" lastFinishedPulling="2025-11-24 09:32:16.651422447 +0000 UTC m=+1713.910399894" 
observedRunningTime="2025-11-24 09:32:17.285420282 +0000 UTC m=+1714.544397728" watchObservedRunningTime="2025-11-24 09:32:17.285795479 +0000 UTC m=+1714.544772926" Nov 24 09:32:18 crc kubenswrapper[4563]: I1124 09:32:18.054772 4563 scope.go:117] "RemoveContainer" containerID="bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512" Nov 24 09:32:18 crc kubenswrapper[4563]: E1124 09:32:18.055004 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:32:32 crc kubenswrapper[4563]: I1124 09:32:32.054665 4563 scope.go:117] "RemoveContainer" containerID="bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512" Nov 24 09:32:32 crc kubenswrapper[4563]: E1124 09:32:32.055538 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:32:47 crc kubenswrapper[4563]: I1124 09:32:47.054772 4563 scope.go:117] "RemoveContainer" containerID="bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512" Nov 24 09:32:47 crc kubenswrapper[4563]: E1124 09:32:47.055499 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:32:58 crc kubenswrapper[4563]: I1124 09:32:58.054989 4563 scope.go:117] "RemoveContainer" containerID="bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512" Nov 24 09:32:58 crc kubenswrapper[4563]: E1124 09:32:58.055515 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:33:10 crc kubenswrapper[4563]: I1124 09:33:10.054569 4563 scope.go:117] "RemoveContainer" containerID="bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512" Nov 24 09:33:10 crc kubenswrapper[4563]: E1124 09:33:10.055061 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:33:23 crc kubenswrapper[4563]: I1124 09:33:23.058775 4563 scope.go:117] "RemoveContainer" containerID="bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512" Nov 24 09:33:23 crc kubenswrapper[4563]: E1124 09:33:23.059328 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:33:35 crc kubenswrapper[4563]: I1124 09:33:35.054868 4563 scope.go:117] "RemoveContainer" containerID="bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512" Nov 24 09:33:35 crc kubenswrapper[4563]: E1124 09:33:35.055609 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:33:46 crc kubenswrapper[4563]: I1124 09:33:46.054907 4563 scope.go:117] "RemoveContainer" containerID="bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512" Nov 24 09:33:46 crc kubenswrapper[4563]: E1124 09:33:46.056862 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:33:58 crc kubenswrapper[4563]: I1124 09:33:58.054686 4563 scope.go:117] "RemoveContainer" containerID="bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512" Nov 24 09:33:58 crc kubenswrapper[4563]: E1124 09:33:58.055425 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:34:11 crc kubenswrapper[4563]: I1124 09:34:11.055992 4563 scope.go:117] "RemoveContainer" containerID="bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512" Nov 24 09:34:11 crc kubenswrapper[4563]: E1124 09:34:11.056949 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:34:22 crc kubenswrapper[4563]: I1124 09:34:22.055276 4563 scope.go:117] "RemoveContainer" containerID="bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512" Nov 24 09:34:22 crc kubenswrapper[4563]: E1124 09:34:22.055771 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:34:37 crc kubenswrapper[4563]: I1124 09:34:37.054387 4563 scope.go:117] "RemoveContainer" containerID="bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512" Nov 24 09:34:37 crc kubenswrapper[4563]: E1124 09:34:37.055140 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:34:52 crc kubenswrapper[4563]: I1124 09:34:52.055044 4563 scope.go:117] "RemoveContainer" containerID="bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512" Nov 24 09:34:52 crc kubenswrapper[4563]: E1124 09:34:52.055748 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:35:03 crc kubenswrapper[4563]: I1124 09:35:03.066865 4563 scope.go:117] "RemoveContainer" containerID="bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512" Nov 24 09:35:03 crc kubenswrapper[4563]: E1124 09:35:03.067787 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:35:16 crc kubenswrapper[4563]: I1124 09:35:16.054506 4563 scope.go:117] "RemoveContainer" containerID="bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512" Nov 24 09:35:16 crc kubenswrapper[4563]: E1124 09:35:16.056224 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:35:29 crc kubenswrapper[4563]: I1124 09:35:29.578399 4563 generic.go:334] "Generic (PLEG): container finished" podID="96a07419-7337-47f5-89aa-233e06eec048" containerID="528847efd7bab73e5849834e5a94c2296ed6b207aafc288180a7a0911deb11d1" exitCode=0 Nov 24 09:35:29 crc kubenswrapper[4563]: I1124 09:35:29.578488 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z8q62" event={"ID":"96a07419-7337-47f5-89aa-233e06eec048","Type":"ContainerDied","Data":"528847efd7bab73e5849834e5a94c2296ed6b207aafc288180a7a0911deb11d1"} Nov 24 09:35:30 crc kubenswrapper[4563]: I1124 09:35:30.054926 4563 scope.go:117] "RemoveContainer" containerID="bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512" Nov 24 09:35:30 crc kubenswrapper[4563]: E1124 09:35:30.055344 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:35:30 crc kubenswrapper[4563]: I1124 09:35:30.916471 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z8q62" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.088667 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5z5s\" (UniqueName: \"kubernetes.io/projected/96a07419-7337-47f5-89aa-233e06eec048-kube-api-access-c5z5s\") pod \"96a07419-7337-47f5-89aa-233e06eec048\" (UID: \"96a07419-7337-47f5-89aa-233e06eec048\") " Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.088773 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96a07419-7337-47f5-89aa-233e06eec048-ssh-key\") pod \"96a07419-7337-47f5-89aa-233e06eec048\" (UID: \"96a07419-7337-47f5-89aa-233e06eec048\") " Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.088800 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96a07419-7337-47f5-89aa-233e06eec048-libvirt-combined-ca-bundle\") pod \"96a07419-7337-47f5-89aa-233e06eec048\" (UID: \"96a07419-7337-47f5-89aa-233e06eec048\") " Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.089052 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/96a07419-7337-47f5-89aa-233e06eec048-libvirt-secret-0\") pod \"96a07419-7337-47f5-89aa-233e06eec048\" (UID: \"96a07419-7337-47f5-89aa-233e06eec048\") " Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.089129 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96a07419-7337-47f5-89aa-233e06eec048-inventory\") pod \"96a07419-7337-47f5-89aa-233e06eec048\" (UID: \"96a07419-7337-47f5-89aa-233e06eec048\") " Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.094143 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/96a07419-7337-47f5-89aa-233e06eec048-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "96a07419-7337-47f5-89aa-233e06eec048" (UID: "96a07419-7337-47f5-89aa-233e06eec048"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.094319 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96a07419-7337-47f5-89aa-233e06eec048-kube-api-access-c5z5s" (OuterVolumeSpecName: "kube-api-access-c5z5s") pod "96a07419-7337-47f5-89aa-233e06eec048" (UID: "96a07419-7337-47f5-89aa-233e06eec048"). InnerVolumeSpecName "kube-api-access-c5z5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.109969 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96a07419-7337-47f5-89aa-233e06eec048-inventory" (OuterVolumeSpecName: "inventory") pod "96a07419-7337-47f5-89aa-233e06eec048" (UID: "96a07419-7337-47f5-89aa-233e06eec048"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.110559 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96a07419-7337-47f5-89aa-233e06eec048-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "96a07419-7337-47f5-89aa-233e06eec048" (UID: "96a07419-7337-47f5-89aa-233e06eec048"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.116313 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96a07419-7337-47f5-89aa-233e06eec048-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "96a07419-7337-47f5-89aa-233e06eec048" (UID: "96a07419-7337-47f5-89aa-233e06eec048"). 
InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.192297 4563 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96a07419-7337-47f5-89aa-233e06eec048-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.192328 4563 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96a07419-7337-47f5-89aa-233e06eec048-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.192341 4563 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/96a07419-7337-47f5-89aa-233e06eec048-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.192349 4563 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96a07419-7337-47f5-89aa-233e06eec048-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.192358 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5z5s\" (UniqueName: \"kubernetes.io/projected/96a07419-7337-47f5-89aa-233e06eec048-kube-api-access-c5z5s\") on node \"crc\" DevicePath \"\"" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.602034 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z8q62" event={"ID":"96a07419-7337-47f5-89aa-233e06eec048","Type":"ContainerDied","Data":"0ec19b9d047ef442de5ef6701091837d6b8d23a5faed27662dad21a62b899624"} Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.602356 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ec19b9d047ef442de5ef6701091837d6b8d23a5faed27662dad21a62b899624" Nov 24 
09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.602081 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z8q62" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.679208 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm"] Nov 24 09:35:31 crc kubenswrapper[4563]: E1124 09:35:31.679542 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96a07419-7337-47f5-89aa-233e06eec048" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.679561 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a07419-7337-47f5-89aa-233e06eec048" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.679757 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="96a07419-7337-47f5-89aa-233e06eec048" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.680287 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.682540 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.682830 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.682916 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.682986 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.683149 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.683218 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jd9jk" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.683313 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.692106 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm"] Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.701440 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nx4nm\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 
09:35:31.701505 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h48f\" (UniqueName: \"kubernetes.io/projected/49ccd723-5c1a-4763-9eb4-5aed7651bad5-kube-api-access-5h48f\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nx4nm\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.701574 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nx4nm\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.701699 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nx4nm\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.701773 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nx4nm\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.701883 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nx4nm\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.701937 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nx4nm\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.701976 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nx4nm\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.702121 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nx4nm\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.805423 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nx4nm\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.805538 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nx4nm\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.805583 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nx4nm\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.805623 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nx4nm\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.805713 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nx4nm\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.805783 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nx4nm\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.805807 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h48f\" (UniqueName: \"kubernetes.io/projected/49ccd723-5c1a-4763-9eb4-5aed7651bad5-kube-api-access-5h48f\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nx4nm\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.805833 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nx4nm\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.805866 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nx4nm\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.806731 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nx4nm\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.809176 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nx4nm\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.810303 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nx4nm\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.810399 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nx4nm\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.810796 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nx4nm\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.811356 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nx4nm\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.814274 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nx4nm\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.814816 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nx4nm\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.820910 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h48f\" (UniqueName: \"kubernetes.io/projected/49ccd723-5c1a-4763-9eb4-5aed7651bad5-kube-api-access-5h48f\") pod \"nova-edpm-deployment-openstack-edpm-ipam-nx4nm\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:35:31 crc kubenswrapper[4563]: I1124 09:35:31.996945 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:35:32 crc kubenswrapper[4563]: I1124 09:35:32.474832 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm"] Nov 24 09:35:32 crc kubenswrapper[4563]: I1124 09:35:32.480001 4563 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 09:35:32 crc kubenswrapper[4563]: I1124 09:35:32.609655 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" event={"ID":"49ccd723-5c1a-4763-9eb4-5aed7651bad5","Type":"ContainerStarted","Data":"f235b326fcf7d0d504682cde52d9b73b4da5ec153e9b4382a155e510327fda7a"} Nov 24 09:35:33 crc kubenswrapper[4563]: I1124 09:35:33.617845 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" event={"ID":"49ccd723-5c1a-4763-9eb4-5aed7651bad5","Type":"ContainerStarted","Data":"2536250d5a1666602a6bbc9180be31a14f977e46c6cbce610473717df267d95e"} Nov 24 09:35:33 crc kubenswrapper[4563]: I1124 09:35:33.642926 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" podStartSLOduration=2.170767429 podStartE2EDuration="2.642915069s" podCreationTimestamp="2025-11-24 09:35:31 +0000 UTC" firstStartedPulling="2025-11-24 09:35:32.479761765 +0000 UTC m=+1909.738739213" lastFinishedPulling="2025-11-24 09:35:32.951909406 +0000 UTC m=+1910.210886853" observedRunningTime="2025-11-24 09:35:33.634603794 +0000 UTC m=+1910.893581242" watchObservedRunningTime="2025-11-24 09:35:33.642915069 +0000 UTC m=+1910.901892506" Nov 24 09:35:41 crc kubenswrapper[4563]: I1124 09:35:41.055220 4563 scope.go:117] "RemoveContainer" containerID="bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512" Nov 24 09:35:41 crc kubenswrapper[4563]: E1124 09:35:41.055962 4563 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:35:55 crc kubenswrapper[4563]: I1124 09:35:55.054758 4563 scope.go:117] "RemoveContainer" containerID="bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512" Nov 24 09:35:55 crc kubenswrapper[4563]: E1124 09:35:55.055282 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:36:08 crc kubenswrapper[4563]: I1124 09:36:08.055112 4563 scope.go:117] "RemoveContainer" containerID="bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512" Nov 24 09:36:08 crc kubenswrapper[4563]: E1124 09:36:08.055690 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:36:20 crc kubenswrapper[4563]: I1124 09:36:20.054425 4563 scope.go:117] "RemoveContainer" containerID="bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512" Nov 24 09:36:20 crc kubenswrapper[4563]: E1124 
09:36:20.055212 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:36:33 crc kubenswrapper[4563]: I1124 09:36:33.059931 4563 scope.go:117] "RemoveContainer" containerID="bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512" Nov 24 09:36:33 crc kubenswrapper[4563]: E1124 09:36:33.060654 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:36:45 crc kubenswrapper[4563]: I1124 09:36:45.053975 4563 scope.go:117] "RemoveContainer" containerID="bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512" Nov 24 09:36:46 crc kubenswrapper[4563]: I1124 09:36:46.080295 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" event={"ID":"3b2bfe55-8989-49b3-bb61-e28189447627","Type":"ContainerStarted","Data":"aacc3722af1959c6fbbc5ddeea8e84512b7ce34c8a0eca9a4336a8f163950d41"} Nov 24 09:37:23 crc kubenswrapper[4563]: I1124 09:37:23.325175 4563 generic.go:334] "Generic (PLEG): container finished" podID="49ccd723-5c1a-4763-9eb4-5aed7651bad5" containerID="2536250d5a1666602a6bbc9180be31a14f977e46c6cbce610473717df267d95e" exitCode=0 Nov 24 09:37:23 crc kubenswrapper[4563]: I1124 09:37:23.325337 4563 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" event={"ID":"49ccd723-5c1a-4763-9eb4-5aed7651bad5","Type":"ContainerDied","Data":"2536250d5a1666602a6bbc9180be31a14f977e46c6cbce610473717df267d95e"} Nov 24 09:37:24 crc kubenswrapper[4563]: I1124 09:37:24.645777 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:37:24 crc kubenswrapper[4563]: I1124 09:37:24.816165 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-inventory\") pod \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " Nov 24 09:37:24 crc kubenswrapper[4563]: I1124 09:37:24.816235 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h48f\" (UniqueName: \"kubernetes.io/projected/49ccd723-5c1a-4763-9eb4-5aed7651bad5-kube-api-access-5h48f\") pod \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " Nov 24 09:37:24 crc kubenswrapper[4563]: I1124 09:37:24.816263 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-cell1-compute-config-0\") pod \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " Nov 24 09:37:24 crc kubenswrapper[4563]: I1124 09:37:24.816289 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-extra-config-0\") pod \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " Nov 24 09:37:24 crc kubenswrapper[4563]: I1124 09:37:24.816351 4563 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-migration-ssh-key-1\") pod \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " Nov 24 09:37:24 crc kubenswrapper[4563]: I1124 09:37:24.816375 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-migration-ssh-key-0\") pod \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " Nov 24 09:37:24 crc kubenswrapper[4563]: I1124 09:37:24.816403 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-combined-ca-bundle\") pod \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " Nov 24 09:37:24 crc kubenswrapper[4563]: I1124 09:37:24.816423 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-cell1-compute-config-1\") pod \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " Nov 24 09:37:24 crc kubenswrapper[4563]: I1124 09:37:24.816451 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-ssh-key\") pod \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\" (UID: \"49ccd723-5c1a-4763-9eb4-5aed7651bad5\") " Nov 24 09:37:24 crc kubenswrapper[4563]: I1124 09:37:24.822768 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ccd723-5c1a-4763-9eb4-5aed7651bad5-kube-api-access-5h48f" (OuterVolumeSpecName: 
"kube-api-access-5h48f") pod "49ccd723-5c1a-4763-9eb4-5aed7651bad5" (UID: "49ccd723-5c1a-4763-9eb4-5aed7651bad5"). InnerVolumeSpecName "kube-api-access-5h48f". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:37:24 crc kubenswrapper[4563]: I1124 09:37:24.823013 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "49ccd723-5c1a-4763-9eb4-5aed7651bad5" (UID: "49ccd723-5c1a-4763-9eb4-5aed7651bad5"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:37:24 crc kubenswrapper[4563]: I1124 09:37:24.841248 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "49ccd723-5c1a-4763-9eb4-5aed7651bad5" (UID: "49ccd723-5c1a-4763-9eb4-5aed7651bad5"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:37:24 crc kubenswrapper[4563]: I1124 09:37:24.841431 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "49ccd723-5c1a-4763-9eb4-5aed7651bad5" (UID: "49ccd723-5c1a-4763-9eb4-5aed7651bad5"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:37:24 crc kubenswrapper[4563]: I1124 09:37:24.841577 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "49ccd723-5c1a-4763-9eb4-5aed7651bad5" (UID: "49ccd723-5c1a-4763-9eb4-5aed7651bad5"). 
InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:37:24 crc kubenswrapper[4563]: I1124 09:37:24.841612 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "49ccd723-5c1a-4763-9eb4-5aed7651bad5" (UID: "49ccd723-5c1a-4763-9eb4-5aed7651bad5"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:37:24 crc kubenswrapper[4563]: I1124 09:37:24.842622 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "49ccd723-5c1a-4763-9eb4-5aed7651bad5" (UID: "49ccd723-5c1a-4763-9eb4-5aed7651bad5"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:37:24 crc kubenswrapper[4563]: I1124 09:37:24.842768 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "49ccd723-5c1a-4763-9eb4-5aed7651bad5" (UID: "49ccd723-5c1a-4763-9eb4-5aed7651bad5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:37:24 crc kubenswrapper[4563]: I1124 09:37:24.856949 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-inventory" (OuterVolumeSpecName: "inventory") pod "49ccd723-5c1a-4763-9eb4-5aed7651bad5" (UID: "49ccd723-5c1a-4763-9eb4-5aed7651bad5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:37:24 crc kubenswrapper[4563]: I1124 09:37:24.917557 4563 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Nov 24 09:37:24 crc kubenswrapper[4563]: I1124 09:37:24.917578 4563 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Nov 24 09:37:24 crc kubenswrapper[4563]: I1124 09:37:24.917587 4563 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:37:24 crc kubenswrapper[4563]: I1124 09:37:24.917595 4563 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Nov 24 09:37:24 crc kubenswrapper[4563]: I1124 09:37:24.917603 4563 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 09:37:24 crc kubenswrapper[4563]: I1124 09:37:24.917612 4563 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 09:37:24 crc kubenswrapper[4563]: I1124 09:37:24.917620 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5h48f\" (UniqueName: \"kubernetes.io/projected/49ccd723-5c1a-4763-9eb4-5aed7651bad5-kube-api-access-5h48f\") on node \"crc\" DevicePath \"\"" Nov 24 
09:37:24 crc kubenswrapper[4563]: I1124 09:37:24.917627 4563 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Nov 24 09:37:24 crc kubenswrapper[4563]: I1124 09:37:24.917634 4563 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/49ccd723-5c1a-4763-9eb4-5aed7651bad5-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.339956 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" event={"ID":"49ccd723-5c1a-4763-9eb4-5aed7651bad5","Type":"ContainerDied","Data":"f235b326fcf7d0d504682cde52d9b73b4da5ec153e9b4382a155e510327fda7a"} Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.339991 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f235b326fcf7d0d504682cde52d9b73b4da5ec153e9b4382a155e510327fda7a" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.340020 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-nx4nm" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.401992 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z"] Nov 24 09:37:25 crc kubenswrapper[4563]: E1124 09:37:25.402611 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ccd723-5c1a-4763-9eb4-5aed7651bad5" containerName="nova-edpm-deployment-openstack-edpm-ipam" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.402754 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ccd723-5c1a-4763-9eb4-5aed7651bad5" containerName="nova-edpm-deployment-openstack-edpm-ipam" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.403090 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ccd723-5c1a-4763-9eb4-5aed7651bad5" containerName="nova-edpm-deployment-openstack-edpm-ipam" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.403736 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.405299 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.405476 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.405532 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.405748 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.406987 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jd9jk" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.409068 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z"] Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.526252 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z\" (UID: \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.526340 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-ceilometer-compute-config-data-0\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z\" (UID: \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.526366 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z\" (UID: \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.526400 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z\" (UID: \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.526415 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z\" (UID: \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.526441 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z\" (UID: \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.526468 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhrlg\" (UniqueName: \"kubernetes.io/projected/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-kube-api-access-vhrlg\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z\" (UID: \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.627886 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z\" (UID: \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.627943 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z\" (UID: \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.627961 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z\" (UID: \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.627987 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z\" (UID: \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.628017 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhrlg\" (UniqueName: \"kubernetes.io/projected/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-kube-api-access-vhrlg\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z\" (UID: \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.628050 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z\" (UID: \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.628118 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z\" (UID: \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.632373 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-ceilometer-compute-config-data-2\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z\" (UID: \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.632434 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z\" (UID: \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.632543 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z\" (UID: \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.632670 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z\" (UID: \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.633150 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z\" (UID: \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.633753 4563 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z\" (UID: \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.640880 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhrlg\" (UniqueName: \"kubernetes.io/projected/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-kube-api-access-vhrlg\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z\" (UID: \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" Nov 24 09:37:25 crc kubenswrapper[4563]: I1124 09:37:25.725878 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" Nov 24 09:37:26 crc kubenswrapper[4563]: I1124 09:37:26.175748 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z"] Nov 24 09:37:26 crc kubenswrapper[4563]: I1124 09:37:26.346199 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" event={"ID":"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca","Type":"ContainerStarted","Data":"f8d2913a4289ff0c4f384b7466b3d042f3e99fff80a5c508d8bae764ec0b40dd"} Nov 24 09:37:27 crc kubenswrapper[4563]: I1124 09:37:27.354459 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" event={"ID":"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca","Type":"ContainerStarted","Data":"7443f5cd48d201697ca6908f2035afc5839c124dc0eba7b07b956b2c70c56c47"} Nov 24 09:37:27 crc kubenswrapper[4563]: I1124 09:37:27.376384 4563 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" podStartSLOduration=1.867643142 podStartE2EDuration="2.376357085s" podCreationTimestamp="2025-11-24 09:37:25 +0000 UTC" firstStartedPulling="2025-11-24 09:37:26.180333908 +0000 UTC m=+2023.439311356" lastFinishedPulling="2025-11-24 09:37:26.689047851 +0000 UTC m=+2023.948025299" observedRunningTime="2025-11-24 09:37:27.369284466 +0000 UTC m=+2024.628261914" watchObservedRunningTime="2025-11-24 09:37:27.376357085 +0000 UTC m=+2024.635334532" Nov 24 09:38:57 crc kubenswrapper[4563]: I1124 09:38:57.975740 4563 generic.go:334] "Generic (PLEG): container finished" podID="0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca" containerID="7443f5cd48d201697ca6908f2035afc5839c124dc0eba7b07b956b2c70c56c47" exitCode=0 Nov 24 09:38:57 crc kubenswrapper[4563]: I1124 09:38:57.975823 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" event={"ID":"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca","Type":"ContainerDied","Data":"7443f5cd48d201697ca6908f2035afc5839c124dc0eba7b07b956b2c70c56c47"} Nov 24 09:38:59 crc kubenswrapper[4563]: I1124 09:38:59.267410 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" Nov 24 09:38:59 crc kubenswrapper[4563]: I1124 09:38:59.311329 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-ceilometer-compute-config-data-1\") pod \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\" (UID: \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\") " Nov 24 09:38:59 crc kubenswrapper[4563]: I1124 09:38:59.311403 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-ceilometer-compute-config-data-0\") pod \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\" (UID: \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\") " Nov 24 09:38:59 crc kubenswrapper[4563]: I1124 09:38:59.311441 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-inventory\") pod \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\" (UID: \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\") " Nov 24 09:38:59 crc kubenswrapper[4563]: I1124 09:38:59.311470 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-telemetry-combined-ca-bundle\") pod \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\" (UID: \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\") " Nov 24 09:38:59 crc kubenswrapper[4563]: I1124 09:38:59.311575 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-ssh-key\") pod \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\" (UID: \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\") " Nov 24 09:38:59 crc kubenswrapper[4563]: I1124 09:38:59.311728 4563 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhrlg\" (UniqueName: \"kubernetes.io/projected/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-kube-api-access-vhrlg\") pod \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\" (UID: \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\") " Nov 24 09:38:59 crc kubenswrapper[4563]: I1124 09:38:59.311752 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-ceilometer-compute-config-data-2\") pod \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\" (UID: \"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca\") " Nov 24 09:38:59 crc kubenswrapper[4563]: I1124 09:38:59.318814 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-kube-api-access-vhrlg" (OuterVolumeSpecName: "kube-api-access-vhrlg") pod "0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca" (UID: "0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca"). InnerVolumeSpecName "kube-api-access-vhrlg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:38:59 crc kubenswrapper[4563]: I1124 09:38:59.319553 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca" (UID: "0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:38:59 crc kubenswrapper[4563]: I1124 09:38:59.335626 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca" (UID: "0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:38:59 crc kubenswrapper[4563]: I1124 09:38:59.336175 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca" (UID: "0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:38:59 crc kubenswrapper[4563]: I1124 09:38:59.337107 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-inventory" (OuterVolumeSpecName: "inventory") pod "0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca" (UID: "0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:38:59 crc kubenswrapper[4563]: I1124 09:38:59.339040 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca" (UID: "0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:38:59 crc kubenswrapper[4563]: I1124 09:38:59.342066 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca" (UID: "0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:38:59 crc kubenswrapper[4563]: I1124 09:38:59.413952 4563 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 09:38:59 crc kubenswrapper[4563]: I1124 09:38:59.413976 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhrlg\" (UniqueName: \"kubernetes.io/projected/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-kube-api-access-vhrlg\") on node \"crc\" DevicePath \"\"" Nov 24 09:38:59 crc kubenswrapper[4563]: I1124 09:38:59.413988 4563 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Nov 24 09:38:59 crc kubenswrapper[4563]: I1124 09:38:59.413997 4563 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Nov 24 09:38:59 crc kubenswrapper[4563]: I1124 09:38:59.414007 4563 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Nov 24 09:38:59 crc 
kubenswrapper[4563]: I1124 09:38:59.414015 4563 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-inventory\") on node \"crc\" DevicePath \"\"" Nov 24 09:38:59 crc kubenswrapper[4563]: I1124 09:38:59.414025 4563 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 09:38:59 crc kubenswrapper[4563]: I1124 09:38:59.991705 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" event={"ID":"0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca","Type":"ContainerDied","Data":"f8d2913a4289ff0c4f384b7466b3d042f3e99fff80a5c508d8bae764ec0b40dd"} Nov 24 09:38:59 crc kubenswrapper[4563]: I1124 09:38:59.991747 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8d2913a4289ff0c4f384b7466b3d042f3e99fff80a5c508d8bae764ec0b40dd" Nov 24 09:38:59 crc kubenswrapper[4563]: I1124 09:38:59.991769 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z" Nov 24 09:39:08 crc kubenswrapper[4563]: I1124 09:39:08.987837 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:39:08 crc kubenswrapper[4563]: I1124 09:39:08.988425 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:39:38 crc kubenswrapper[4563]: I1124 09:39:38.987132 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:39:38 crc kubenswrapper[4563]: I1124 09:39:38.987793 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.503252 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Nov 24 09:39:40 crc kubenswrapper[4563]: E1124 09:39:40.503567 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Nov 24 09:39:40 crc 
kubenswrapper[4563]: I1124 09:39:40.503581 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.503752 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.504281 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.505625 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.505928 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.506620 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.506688 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-nx56k" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.511987 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.572803 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d15e06ff-83ac-44e9-aebe-9756628722e6-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " pod="openstack/tempest-tests-tempest" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.572924 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d15e06ff-83ac-44e9-aebe-9756628722e6-config-data\") pod \"tempest-tests-tempest\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " pod="openstack/tempest-tests-tempest" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.573094 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d15e06ff-83ac-44e9-aebe-9756628722e6-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " pod="openstack/tempest-tests-tempest" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.674991 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d15e06ff-83ac-44e9-aebe-9756628722e6-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " pod="openstack/tempest-tests-tempest" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.675049 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d15e06ff-83ac-44e9-aebe-9756628722e6-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " pod="openstack/tempest-tests-tempest" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.675089 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " pod="openstack/tempest-tests-tempest" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.675132 4563 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d15e06ff-83ac-44e9-aebe-9756628722e6-config-data\") pod \"tempest-tests-tempest\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " pod="openstack/tempest-tests-tempest" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.675147 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d15e06ff-83ac-44e9-aebe-9756628722e6-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " pod="openstack/tempest-tests-tempest" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.675193 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px7fv\" (UniqueName: \"kubernetes.io/projected/d15e06ff-83ac-44e9-aebe-9756628722e6-kube-api-access-px7fv\") pod \"tempest-tests-tempest\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " pod="openstack/tempest-tests-tempest" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.675312 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d15e06ff-83ac-44e9-aebe-9756628722e6-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " pod="openstack/tempest-tests-tempest" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.675337 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d15e06ff-83ac-44e9-aebe-9756628722e6-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " pod="openstack/tempest-tests-tempest" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.675386 4563 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d15e06ff-83ac-44e9-aebe-9756628722e6-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " pod="openstack/tempest-tests-tempest" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.676227 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d15e06ff-83ac-44e9-aebe-9756628722e6-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " pod="openstack/tempest-tests-tempest" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.676369 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d15e06ff-83ac-44e9-aebe-9756628722e6-config-data\") pod \"tempest-tests-tempest\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " pod="openstack/tempest-tests-tempest" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.682397 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d15e06ff-83ac-44e9-aebe-9756628722e6-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " pod="openstack/tempest-tests-tempest" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.776704 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d15e06ff-83ac-44e9-aebe-9756628722e6-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " pod="openstack/tempest-tests-tempest" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.776746 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/d15e06ff-83ac-44e9-aebe-9756628722e6-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " pod="openstack/tempest-tests-tempest" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.776812 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d15e06ff-83ac-44e9-aebe-9756628722e6-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " pod="openstack/tempest-tests-tempest" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.776838 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " pod="openstack/tempest-tests-tempest" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.776868 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d15e06ff-83ac-44e9-aebe-9756628722e6-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " pod="openstack/tempest-tests-tempest" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.776892 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px7fv\" (UniqueName: \"kubernetes.io/projected/d15e06ff-83ac-44e9-aebe-9756628722e6-kube-api-access-px7fv\") pod \"tempest-tests-tempest\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " pod="openstack/tempest-tests-tempest" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.777225 4563 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: 
\"d15e06ff-83ac-44e9-aebe-9756628722e6\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/tempest-tests-tempest" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.777239 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d15e06ff-83ac-44e9-aebe-9756628722e6-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " pod="openstack/tempest-tests-tempest" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.777376 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d15e06ff-83ac-44e9-aebe-9756628722e6-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " pod="openstack/tempest-tests-tempest" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.780958 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d15e06ff-83ac-44e9-aebe-9756628722e6-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " pod="openstack/tempest-tests-tempest" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.781796 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d15e06ff-83ac-44e9-aebe-9756628722e6-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " pod="openstack/tempest-tests-tempest" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.791114 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px7fv\" (UniqueName: \"kubernetes.io/projected/d15e06ff-83ac-44e9-aebe-9756628722e6-kube-api-access-px7fv\") pod \"tempest-tests-tempest\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " 
pod="openstack/tempest-tests-tempest" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.797295 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " pod="openstack/tempest-tests-tempest" Nov 24 09:39:40 crc kubenswrapper[4563]: I1124 09:39:40.819108 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 24 09:39:41 crc kubenswrapper[4563]: I1124 09:39:41.184599 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Nov 24 09:39:41 crc kubenswrapper[4563]: I1124 09:39:41.292516 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d15e06ff-83ac-44e9-aebe-9756628722e6","Type":"ContainerStarted","Data":"48d83a625135a00ab5e632c2f8ebb9059ff28833e1cbafce66ce33efd48207fa"} Nov 24 09:40:02 crc kubenswrapper[4563]: E1124 09:40:02.360052 4563 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Nov 24 09:40:02 crc kubenswrapper[4563]: E1124 09:40:02.360842 4563 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-px7fv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(d15e06ff-83ac-44e9-aebe-9756628722e6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 24 09:40:02 crc kubenswrapper[4563]: E1124 09:40:02.362003 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="d15e06ff-83ac-44e9-aebe-9756628722e6" Nov 24 09:40:02 crc kubenswrapper[4563]: E1124 09:40:02.481696 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="d15e06ff-83ac-44e9-aebe-9756628722e6" Nov 24 09:40:08 crc 
kubenswrapper[4563]: I1124 09:40:08.987888 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:40:08 crc kubenswrapper[4563]: I1124 09:40:08.989026 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:40:08 crc kubenswrapper[4563]: I1124 09:40:08.989096 4563 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" Nov 24 09:40:08 crc kubenswrapper[4563]: I1124 09:40:08.989976 4563 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aacc3722af1959c6fbbc5ddeea8e84512b7ce34c8a0eca9a4336a8f163950d41"} pod="openshift-machine-config-operator/machine-config-daemon-stlxr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 09:40:08 crc kubenswrapper[4563]: I1124 09:40:08.990048 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" containerID="cri-o://aacc3722af1959c6fbbc5ddeea8e84512b7ce34c8a0eca9a4336a8f163950d41" gracePeriod=600 Nov 24 09:40:09 crc kubenswrapper[4563]: I1124 09:40:09.546696 4563 generic.go:334] "Generic (PLEG): container finished" podID="3b2bfe55-8989-49b3-bb61-e28189447627" 
containerID="aacc3722af1959c6fbbc5ddeea8e84512b7ce34c8a0eca9a4336a8f163950d41" exitCode=0 Nov 24 09:40:09 crc kubenswrapper[4563]: I1124 09:40:09.546762 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" event={"ID":"3b2bfe55-8989-49b3-bb61-e28189447627","Type":"ContainerDied","Data":"aacc3722af1959c6fbbc5ddeea8e84512b7ce34c8a0eca9a4336a8f163950d41"} Nov 24 09:40:09 crc kubenswrapper[4563]: I1124 09:40:09.547055 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" event={"ID":"3b2bfe55-8989-49b3-bb61-e28189447627","Type":"ContainerStarted","Data":"4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610"} Nov 24 09:40:09 crc kubenswrapper[4563]: I1124 09:40:09.547083 4563 scope.go:117] "RemoveContainer" containerID="bf4e38c59af80ddc10387af4646b91d436fec28928eb37662cf3731280e4a512" Nov 24 09:40:17 crc kubenswrapper[4563]: I1124 09:40:17.131912 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qbdbt"] Nov 24 09:40:17 crc kubenswrapper[4563]: I1124 09:40:17.138850 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qbdbt" Nov 24 09:40:17 crc kubenswrapper[4563]: I1124 09:40:17.149442 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qbdbt"] Nov 24 09:40:17 crc kubenswrapper[4563]: I1124 09:40:17.289777 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q598j\" (UniqueName: \"kubernetes.io/projected/840cef28-b4c1-40eb-baa7-a8487b3eae9c-kube-api-access-q598j\") pod \"redhat-marketplace-qbdbt\" (UID: \"840cef28-b4c1-40eb-baa7-a8487b3eae9c\") " pod="openshift-marketplace/redhat-marketplace-qbdbt" Nov 24 09:40:17 crc kubenswrapper[4563]: I1124 09:40:17.289854 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840cef28-b4c1-40eb-baa7-a8487b3eae9c-catalog-content\") pod \"redhat-marketplace-qbdbt\" (UID: \"840cef28-b4c1-40eb-baa7-a8487b3eae9c\") " pod="openshift-marketplace/redhat-marketplace-qbdbt" Nov 24 09:40:17 crc kubenswrapper[4563]: I1124 09:40:17.290082 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840cef28-b4c1-40eb-baa7-a8487b3eae9c-utilities\") pod \"redhat-marketplace-qbdbt\" (UID: \"840cef28-b4c1-40eb-baa7-a8487b3eae9c\") " pod="openshift-marketplace/redhat-marketplace-qbdbt" Nov 24 09:40:17 crc kubenswrapper[4563]: I1124 09:40:17.392538 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840cef28-b4c1-40eb-baa7-a8487b3eae9c-utilities\") pod \"redhat-marketplace-qbdbt\" (UID: \"840cef28-b4c1-40eb-baa7-a8487b3eae9c\") " pod="openshift-marketplace/redhat-marketplace-qbdbt" Nov 24 09:40:17 crc kubenswrapper[4563]: I1124 09:40:17.392819 4563 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840cef28-b4c1-40eb-baa7-a8487b3eae9c-catalog-content\") pod \"redhat-marketplace-qbdbt\" (UID: \"840cef28-b4c1-40eb-baa7-a8487b3eae9c\") " pod="openshift-marketplace/redhat-marketplace-qbdbt" Nov 24 09:40:17 crc kubenswrapper[4563]: I1124 09:40:17.392851 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q598j\" (UniqueName: \"kubernetes.io/projected/840cef28-b4c1-40eb-baa7-a8487b3eae9c-kube-api-access-q598j\") pod \"redhat-marketplace-qbdbt\" (UID: \"840cef28-b4c1-40eb-baa7-a8487b3eae9c\") " pod="openshift-marketplace/redhat-marketplace-qbdbt" Nov 24 09:40:17 crc kubenswrapper[4563]: I1124 09:40:17.393189 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840cef28-b4c1-40eb-baa7-a8487b3eae9c-utilities\") pod \"redhat-marketplace-qbdbt\" (UID: \"840cef28-b4c1-40eb-baa7-a8487b3eae9c\") " pod="openshift-marketplace/redhat-marketplace-qbdbt" Nov 24 09:40:17 crc kubenswrapper[4563]: I1124 09:40:17.393233 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840cef28-b4c1-40eb-baa7-a8487b3eae9c-catalog-content\") pod \"redhat-marketplace-qbdbt\" (UID: \"840cef28-b4c1-40eb-baa7-a8487b3eae9c\") " pod="openshift-marketplace/redhat-marketplace-qbdbt" Nov 24 09:40:17 crc kubenswrapper[4563]: I1124 09:40:17.414741 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q598j\" (UniqueName: \"kubernetes.io/projected/840cef28-b4c1-40eb-baa7-a8487b3eae9c-kube-api-access-q598j\") pod \"redhat-marketplace-qbdbt\" (UID: \"840cef28-b4c1-40eb-baa7-a8487b3eae9c\") " pod="openshift-marketplace/redhat-marketplace-qbdbt" Nov 24 09:40:17 crc kubenswrapper[4563]: I1124 09:40:17.460338 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qbdbt" Nov 24 09:40:17 crc kubenswrapper[4563]: I1124 09:40:17.847705 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qbdbt"] Nov 24 09:40:18 crc kubenswrapper[4563]: E1124 09:40:18.559954 4563 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = parsing image configuration: Get \"https://cdn01.quay.io/quayio-production-s3/sha256/2e/2e5864f03aaafb653d069ef6599bedee7a2fc72f7e16e452b97b6d1905ea6339?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20251124%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20251124T094017Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=375cb5e66b0c9d6708d545ecfe3fb5ab3cdab88756c64fd26787ae3997ac5c5b®ion=us-east-1&namespace=podified-antelope-centos9&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=openstack-tempest-all&akamai_signature=exp=1763978117~hmac=e4ab64135b27cf2b3f1ad8a89d18d824c59a071f33546fd6fe3d8ab2b107f926\": remote error: tls: internal error" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Nov 24 09:40:18 crc kubenswrapper[4563]: E1124 09:40:18.560343 4563 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-px7fv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(d15e06ff-83ac-44e9-aebe-9756628722e6): ErrImagePull: parsing image configuration: Get \"https://cdn01.quay.io/quayio-production-s3/sha256/2e/2e5864f03aaafb653d069ef6599bedee7a2fc72f7e16e452b97b6d1905ea6339?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20251124%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20251124T094017Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=375cb5e66b0c9d6708d545ecfe3fb5ab3cdab88756c64fd26787ae3997ac5c5b®ion=us-east-1&namespace=podified-antelope-centos9&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=openstack-tempest-all&akamai_signature=exp=1763978117~hmac=e4ab64135b27cf2b3f1ad8a89d18d824c59a071f33546fd6fe3d8ab2b107f926\": remote error: tls: internal error" logger="UnhandledError" Nov 24 09:40:18 crc kubenswrapper[4563]: E1124 09:40:18.561544 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"parsing image configuration: Get \\\"https://cdn01.quay.io/quayio-production-s3/sha256/2e/2e5864f03aaafb653d069ef6599bedee7a2fc72f7e16e452b97b6d1905ea6339?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20251124%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20251124T094017Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=375cb5e66b0c9d6708d545ecfe3fb5ab3cdab88756c64fd26787ae3997ac5c5b®ion=us-east-1&namespace=podified-antelope-centos9&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=openstack-tempest-all&akamai_signature=exp=1763978117~hmac=e4ab64135b27cf2b3f1ad8a89d18d824c59a071f33546fd6fe3d8ab2b107f926\\\": remote error: tls: internal error\"" pod="openstack/tempest-tests-tempest" podUID="d15e06ff-83ac-44e9-aebe-9756628722e6" Nov 24 09:40:18 crc kubenswrapper[4563]: I1124 09:40:18.615141 4563 generic.go:334] "Generic (PLEG): container finished" podID="840cef28-b4c1-40eb-baa7-a8487b3eae9c" containerID="6a3cf6a7b67355eacb177e6d65cd3d452133f36270331336b5b364a95c5d542b" exitCode=0 Nov 24 09:40:18 crc kubenswrapper[4563]: I1124 09:40:18.615182 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qbdbt" event={"ID":"840cef28-b4c1-40eb-baa7-a8487b3eae9c","Type":"ContainerDied","Data":"6a3cf6a7b67355eacb177e6d65cd3d452133f36270331336b5b364a95c5d542b"} Nov 24 09:40:18 crc kubenswrapper[4563]: I1124 09:40:18.615210 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qbdbt" event={"ID":"840cef28-b4c1-40eb-baa7-a8487b3eae9c","Type":"ContainerStarted","Data":"5a42bbb806014dcf1b2ee7eda541b22b32f27d38046ade08768985ebfe14b044"} Nov 24 09:40:19 crc kubenswrapper[4563]: I1124 09:40:19.623348 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qbdbt" 
event={"ID":"840cef28-b4c1-40eb-baa7-a8487b3eae9c","Type":"ContainerStarted","Data":"e62aeed2c639d7eb20c1721e83430f9076e7669ca4cb044373a7df7e5b89c2ed"} Nov 24 09:40:20 crc kubenswrapper[4563]: I1124 09:40:20.642109 4563 generic.go:334] "Generic (PLEG): container finished" podID="840cef28-b4c1-40eb-baa7-a8487b3eae9c" containerID="e62aeed2c639d7eb20c1721e83430f9076e7669ca4cb044373a7df7e5b89c2ed" exitCode=0 Nov 24 09:40:20 crc kubenswrapper[4563]: I1124 09:40:20.642153 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qbdbt" event={"ID":"840cef28-b4c1-40eb-baa7-a8487b3eae9c","Type":"ContainerDied","Data":"e62aeed2c639d7eb20c1721e83430f9076e7669ca4cb044373a7df7e5b89c2ed"} Nov 24 09:40:21 crc kubenswrapper[4563]: I1124 09:40:21.652829 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qbdbt" event={"ID":"840cef28-b4c1-40eb-baa7-a8487b3eae9c","Type":"ContainerStarted","Data":"c25bc2c7554d9b312d18f642a78b0e6dea88369c13b98f2b753dfde49ad6e603"} Nov 24 09:40:21 crc kubenswrapper[4563]: I1124 09:40:21.673877 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qbdbt" podStartSLOduration=2.180915011 podStartE2EDuration="4.673859781s" podCreationTimestamp="2025-11-24 09:40:17 +0000 UTC" firstStartedPulling="2025-11-24 09:40:18.617328207 +0000 UTC m=+2195.876305655" lastFinishedPulling="2025-11-24 09:40:21.110272978 +0000 UTC m=+2198.369250425" observedRunningTime="2025-11-24 09:40:21.669981787 +0000 UTC m=+2198.928959234" watchObservedRunningTime="2025-11-24 09:40:21.673859781 +0000 UTC m=+2198.932837228" Nov 24 09:40:27 crc kubenswrapper[4563]: I1124 09:40:27.460523 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qbdbt" Nov 24 09:40:27 crc kubenswrapper[4563]: I1124 09:40:27.461054 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-marketplace-qbdbt" Nov 24 09:40:27 crc kubenswrapper[4563]: I1124 09:40:27.494305 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qbdbt" Nov 24 09:40:27 crc kubenswrapper[4563]: I1124 09:40:27.734671 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qbdbt" Nov 24 09:40:27 crc kubenswrapper[4563]: I1124 09:40:27.776967 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qbdbt"] Nov 24 09:40:29 crc kubenswrapper[4563]: I1124 09:40:29.721069 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qbdbt" podUID="840cef28-b4c1-40eb-baa7-a8487b3eae9c" containerName="registry-server" containerID="cri-o://c25bc2c7554d9b312d18f642a78b0e6dea88369c13b98f2b753dfde49ad6e603" gracePeriod=2 Nov 24 09:40:30 crc kubenswrapper[4563]: I1124 09:40:30.113684 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qbdbt" Nov 24 09:40:30 crc kubenswrapper[4563]: I1124 09:40:30.228522 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q598j\" (UniqueName: \"kubernetes.io/projected/840cef28-b4c1-40eb-baa7-a8487b3eae9c-kube-api-access-q598j\") pod \"840cef28-b4c1-40eb-baa7-a8487b3eae9c\" (UID: \"840cef28-b4c1-40eb-baa7-a8487b3eae9c\") " Nov 24 09:40:30 crc kubenswrapper[4563]: I1124 09:40:30.229022 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840cef28-b4c1-40eb-baa7-a8487b3eae9c-catalog-content\") pod \"840cef28-b4c1-40eb-baa7-a8487b3eae9c\" (UID: \"840cef28-b4c1-40eb-baa7-a8487b3eae9c\") " Nov 24 09:40:30 crc kubenswrapper[4563]: I1124 09:40:30.229373 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840cef28-b4c1-40eb-baa7-a8487b3eae9c-utilities\") pod \"840cef28-b4c1-40eb-baa7-a8487b3eae9c\" (UID: \"840cef28-b4c1-40eb-baa7-a8487b3eae9c\") " Nov 24 09:40:30 crc kubenswrapper[4563]: I1124 09:40:30.230088 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/840cef28-b4c1-40eb-baa7-a8487b3eae9c-utilities" (OuterVolumeSpecName: "utilities") pod "840cef28-b4c1-40eb-baa7-a8487b3eae9c" (UID: "840cef28-b4c1-40eb-baa7-a8487b3eae9c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:40:30 crc kubenswrapper[4563]: I1124 09:40:30.234277 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/840cef28-b4c1-40eb-baa7-a8487b3eae9c-kube-api-access-q598j" (OuterVolumeSpecName: "kube-api-access-q598j") pod "840cef28-b4c1-40eb-baa7-a8487b3eae9c" (UID: "840cef28-b4c1-40eb-baa7-a8487b3eae9c"). InnerVolumeSpecName "kube-api-access-q598j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:40:30 crc kubenswrapper[4563]: I1124 09:40:30.242005 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/840cef28-b4c1-40eb-baa7-a8487b3eae9c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "840cef28-b4c1-40eb-baa7-a8487b3eae9c" (UID: "840cef28-b4c1-40eb-baa7-a8487b3eae9c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:40:30 crc kubenswrapper[4563]: I1124 09:40:30.332797 4563 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840cef28-b4c1-40eb-baa7-a8487b3eae9c-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 09:40:30 crc kubenswrapper[4563]: I1124 09:40:30.332851 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q598j\" (UniqueName: \"kubernetes.io/projected/840cef28-b4c1-40eb-baa7-a8487b3eae9c-kube-api-access-q598j\") on node \"crc\" DevicePath \"\"" Nov 24 09:40:30 crc kubenswrapper[4563]: I1124 09:40:30.332866 4563 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840cef28-b4c1-40eb-baa7-a8487b3eae9c-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 09:40:30 crc kubenswrapper[4563]: I1124 09:40:30.733836 4563 generic.go:334] "Generic (PLEG): container finished" podID="840cef28-b4c1-40eb-baa7-a8487b3eae9c" containerID="c25bc2c7554d9b312d18f642a78b0e6dea88369c13b98f2b753dfde49ad6e603" exitCode=0 Nov 24 09:40:30 crc kubenswrapper[4563]: I1124 09:40:30.733885 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qbdbt" event={"ID":"840cef28-b4c1-40eb-baa7-a8487b3eae9c","Type":"ContainerDied","Data":"c25bc2c7554d9b312d18f642a78b0e6dea88369c13b98f2b753dfde49ad6e603"} Nov 24 09:40:30 crc kubenswrapper[4563]: I1124 09:40:30.733922 4563 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-qbdbt" event={"ID":"840cef28-b4c1-40eb-baa7-a8487b3eae9c","Type":"ContainerDied","Data":"5a42bbb806014dcf1b2ee7eda541b22b32f27d38046ade08768985ebfe14b044"} Nov 24 09:40:30 crc kubenswrapper[4563]: I1124 09:40:30.733944 4563 scope.go:117] "RemoveContainer" containerID="c25bc2c7554d9b312d18f642a78b0e6dea88369c13b98f2b753dfde49ad6e603" Nov 24 09:40:30 crc kubenswrapper[4563]: I1124 09:40:30.734077 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qbdbt" Nov 24 09:40:30 crc kubenswrapper[4563]: I1124 09:40:30.777228 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qbdbt"] Nov 24 09:40:30 crc kubenswrapper[4563]: I1124 09:40:30.782801 4563 scope.go:117] "RemoveContainer" containerID="e62aeed2c639d7eb20c1721e83430f9076e7669ca4cb044373a7df7e5b89c2ed" Nov 24 09:40:30 crc kubenswrapper[4563]: I1124 09:40:30.793192 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qbdbt"] Nov 24 09:40:30 crc kubenswrapper[4563]: I1124 09:40:30.878285 4563 scope.go:117] "RemoveContainer" containerID="6a3cf6a7b67355eacb177e6d65cd3d452133f36270331336b5b364a95c5d542b" Nov 24 09:40:30 crc kubenswrapper[4563]: I1124 09:40:30.900723 4563 scope.go:117] "RemoveContainer" containerID="c25bc2c7554d9b312d18f642a78b0e6dea88369c13b98f2b753dfde49ad6e603" Nov 24 09:40:30 crc kubenswrapper[4563]: E1124 09:40:30.901314 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c25bc2c7554d9b312d18f642a78b0e6dea88369c13b98f2b753dfde49ad6e603\": container with ID starting with c25bc2c7554d9b312d18f642a78b0e6dea88369c13b98f2b753dfde49ad6e603 not found: ID does not exist" containerID="c25bc2c7554d9b312d18f642a78b0e6dea88369c13b98f2b753dfde49ad6e603" Nov 24 09:40:30 crc kubenswrapper[4563]: I1124 09:40:30.901344 4563 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c25bc2c7554d9b312d18f642a78b0e6dea88369c13b98f2b753dfde49ad6e603"} err="failed to get container status \"c25bc2c7554d9b312d18f642a78b0e6dea88369c13b98f2b753dfde49ad6e603\": rpc error: code = NotFound desc = could not find container \"c25bc2c7554d9b312d18f642a78b0e6dea88369c13b98f2b753dfde49ad6e603\": container with ID starting with c25bc2c7554d9b312d18f642a78b0e6dea88369c13b98f2b753dfde49ad6e603 not found: ID does not exist" Nov 24 09:40:30 crc kubenswrapper[4563]: I1124 09:40:30.901365 4563 scope.go:117] "RemoveContainer" containerID="e62aeed2c639d7eb20c1721e83430f9076e7669ca4cb044373a7df7e5b89c2ed" Nov 24 09:40:30 crc kubenswrapper[4563]: E1124 09:40:30.901806 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e62aeed2c639d7eb20c1721e83430f9076e7669ca4cb044373a7df7e5b89c2ed\": container with ID starting with e62aeed2c639d7eb20c1721e83430f9076e7669ca4cb044373a7df7e5b89c2ed not found: ID does not exist" containerID="e62aeed2c639d7eb20c1721e83430f9076e7669ca4cb044373a7df7e5b89c2ed" Nov 24 09:40:30 crc kubenswrapper[4563]: I1124 09:40:30.901826 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e62aeed2c639d7eb20c1721e83430f9076e7669ca4cb044373a7df7e5b89c2ed"} err="failed to get container status \"e62aeed2c639d7eb20c1721e83430f9076e7669ca4cb044373a7df7e5b89c2ed\": rpc error: code = NotFound desc = could not find container \"e62aeed2c639d7eb20c1721e83430f9076e7669ca4cb044373a7df7e5b89c2ed\": container with ID starting with e62aeed2c639d7eb20c1721e83430f9076e7669ca4cb044373a7df7e5b89c2ed not found: ID does not exist" Nov 24 09:40:30 crc kubenswrapper[4563]: I1124 09:40:30.901842 4563 scope.go:117] "RemoveContainer" containerID="6a3cf6a7b67355eacb177e6d65cd3d452133f36270331336b5b364a95c5d542b" Nov 24 09:40:30 crc kubenswrapper[4563]: E1124 
09:40:30.902089 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a3cf6a7b67355eacb177e6d65cd3d452133f36270331336b5b364a95c5d542b\": container with ID starting with 6a3cf6a7b67355eacb177e6d65cd3d452133f36270331336b5b364a95c5d542b not found: ID does not exist" containerID="6a3cf6a7b67355eacb177e6d65cd3d452133f36270331336b5b364a95c5d542b" Nov 24 09:40:30 crc kubenswrapper[4563]: I1124 09:40:30.902108 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a3cf6a7b67355eacb177e6d65cd3d452133f36270331336b5b364a95c5d542b"} err="failed to get container status \"6a3cf6a7b67355eacb177e6d65cd3d452133f36270331336b5b364a95c5d542b\": rpc error: code = NotFound desc = could not find container \"6a3cf6a7b67355eacb177e6d65cd3d452133f36270331336b5b364a95c5d542b\": container with ID starting with 6a3cf6a7b67355eacb177e6d65cd3d452133f36270331336b5b364a95c5d542b not found: ID does not exist" Nov 24 09:40:31 crc kubenswrapper[4563]: E1124 09:40:31.055464 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="d15e06ff-83ac-44e9-aebe-9756628722e6" Nov 24 09:40:31 crc kubenswrapper[4563]: I1124 09:40:31.064862 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="840cef28-b4c1-40eb-baa7-a8487b3eae9c" path="/var/lib/kubelet/pods/840cef28-b4c1-40eb-baa7-a8487b3eae9c/volumes" Nov 24 09:40:44 crc kubenswrapper[4563]: I1124 09:40:44.056484 4563 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 09:40:44 crc kubenswrapper[4563]: I1124 09:40:44.614753 4563 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"tempest-tests-tempest-env-vars-s0" Nov 24 09:40:45 crc kubenswrapper[4563]: I1124 09:40:45.852142 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d15e06ff-83ac-44e9-aebe-9756628722e6","Type":"ContainerStarted","Data":"bff15d7286e8c850efd484a0717502e97e15313eca4f3b4fb2fd32320310b20d"} Nov 24 09:40:45 crc kubenswrapper[4563]: I1124 09:40:45.872581 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.452744959 podStartE2EDuration="1m6.872564456s" podCreationTimestamp="2025-11-24 09:39:39 +0000 UTC" firstStartedPulling="2025-11-24 09:39:41.191075712 +0000 UTC m=+2158.450053160" lastFinishedPulling="2025-11-24 09:40:44.610895211 +0000 UTC m=+2221.869872657" observedRunningTime="2025-11-24 09:40:45.865093775 +0000 UTC m=+2223.124071223" watchObservedRunningTime="2025-11-24 09:40:45.872564456 +0000 UTC m=+2223.131541903" Nov 24 09:42:04 crc kubenswrapper[4563]: I1124 09:42:04.832456 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4ls8d"] Nov 24 09:42:04 crc kubenswrapper[4563]: E1124 09:42:04.833326 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="840cef28-b4c1-40eb-baa7-a8487b3eae9c" containerName="registry-server" Nov 24 09:42:04 crc kubenswrapper[4563]: I1124 09:42:04.833340 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="840cef28-b4c1-40eb-baa7-a8487b3eae9c" containerName="registry-server" Nov 24 09:42:04 crc kubenswrapper[4563]: E1124 09:42:04.833366 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="840cef28-b4c1-40eb-baa7-a8487b3eae9c" containerName="extract-utilities" Nov 24 09:42:04 crc kubenswrapper[4563]: I1124 09:42:04.833372 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="840cef28-b4c1-40eb-baa7-a8487b3eae9c" containerName="extract-utilities" Nov 24 09:42:04 crc kubenswrapper[4563]: E1124 
09:42:04.833393 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="840cef28-b4c1-40eb-baa7-a8487b3eae9c" containerName="extract-content" Nov 24 09:42:04 crc kubenswrapper[4563]: I1124 09:42:04.833398 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="840cef28-b4c1-40eb-baa7-a8487b3eae9c" containerName="extract-content" Nov 24 09:42:04 crc kubenswrapper[4563]: I1124 09:42:04.833570 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="840cef28-b4c1-40eb-baa7-a8487b3eae9c" containerName="registry-server" Nov 24 09:42:04 crc kubenswrapper[4563]: I1124 09:42:04.834977 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4ls8d" Nov 24 09:42:04 crc kubenswrapper[4563]: I1124 09:42:04.844482 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4ls8d"] Nov 24 09:42:04 crc kubenswrapper[4563]: I1124 09:42:04.904689 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f2ab7de-6a8e-4f44-8546-385fae9849fa-catalog-content\") pod \"certified-operators-4ls8d\" (UID: \"1f2ab7de-6a8e-4f44-8546-385fae9849fa\") " pod="openshift-marketplace/certified-operators-4ls8d" Nov 24 09:42:04 crc kubenswrapper[4563]: I1124 09:42:04.904782 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f2ab7de-6a8e-4f44-8546-385fae9849fa-utilities\") pod \"certified-operators-4ls8d\" (UID: \"1f2ab7de-6a8e-4f44-8546-385fae9849fa\") " pod="openshift-marketplace/certified-operators-4ls8d" Nov 24 09:42:04 crc kubenswrapper[4563]: I1124 09:42:04.904917 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx2rw\" (UniqueName: 
\"kubernetes.io/projected/1f2ab7de-6a8e-4f44-8546-385fae9849fa-kube-api-access-bx2rw\") pod \"certified-operators-4ls8d\" (UID: \"1f2ab7de-6a8e-4f44-8546-385fae9849fa\") " pod="openshift-marketplace/certified-operators-4ls8d" Nov 24 09:42:05 crc kubenswrapper[4563]: I1124 09:42:05.006716 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx2rw\" (UniqueName: \"kubernetes.io/projected/1f2ab7de-6a8e-4f44-8546-385fae9849fa-kube-api-access-bx2rw\") pod \"certified-operators-4ls8d\" (UID: \"1f2ab7de-6a8e-4f44-8546-385fae9849fa\") " pod="openshift-marketplace/certified-operators-4ls8d" Nov 24 09:42:05 crc kubenswrapper[4563]: I1124 09:42:05.006906 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f2ab7de-6a8e-4f44-8546-385fae9849fa-catalog-content\") pod \"certified-operators-4ls8d\" (UID: \"1f2ab7de-6a8e-4f44-8546-385fae9849fa\") " pod="openshift-marketplace/certified-operators-4ls8d" Nov 24 09:42:05 crc kubenswrapper[4563]: I1124 09:42:05.006988 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f2ab7de-6a8e-4f44-8546-385fae9849fa-utilities\") pod \"certified-operators-4ls8d\" (UID: \"1f2ab7de-6a8e-4f44-8546-385fae9849fa\") " pod="openshift-marketplace/certified-operators-4ls8d" Nov 24 09:42:05 crc kubenswrapper[4563]: I1124 09:42:05.007415 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f2ab7de-6a8e-4f44-8546-385fae9849fa-utilities\") pod \"certified-operators-4ls8d\" (UID: \"1f2ab7de-6a8e-4f44-8546-385fae9849fa\") " pod="openshift-marketplace/certified-operators-4ls8d" Nov 24 09:42:05 crc kubenswrapper[4563]: I1124 09:42:05.007444 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1f2ab7de-6a8e-4f44-8546-385fae9849fa-catalog-content\") pod \"certified-operators-4ls8d\" (UID: \"1f2ab7de-6a8e-4f44-8546-385fae9849fa\") " pod="openshift-marketplace/certified-operators-4ls8d" Nov 24 09:42:05 crc kubenswrapper[4563]: I1124 09:42:05.033786 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx2rw\" (UniqueName: \"kubernetes.io/projected/1f2ab7de-6a8e-4f44-8546-385fae9849fa-kube-api-access-bx2rw\") pod \"certified-operators-4ls8d\" (UID: \"1f2ab7de-6a8e-4f44-8546-385fae9849fa\") " pod="openshift-marketplace/certified-operators-4ls8d" Nov 24 09:42:05 crc kubenswrapper[4563]: I1124 09:42:05.149735 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4ls8d" Nov 24 09:42:05 crc kubenswrapper[4563]: I1124 09:42:05.623009 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4ls8d"] Nov 24 09:42:06 crc kubenswrapper[4563]: I1124 09:42:06.462949 4563 generic.go:334] "Generic (PLEG): container finished" podID="1f2ab7de-6a8e-4f44-8546-385fae9849fa" containerID="b0b6f96ec13dca714ac7a7f6e95a19ebf28340a47333d11bc126b34d04e990c6" exitCode=0 Nov 24 09:42:06 crc kubenswrapper[4563]: I1124 09:42:06.463000 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4ls8d" event={"ID":"1f2ab7de-6a8e-4f44-8546-385fae9849fa","Type":"ContainerDied","Data":"b0b6f96ec13dca714ac7a7f6e95a19ebf28340a47333d11bc126b34d04e990c6"} Nov 24 09:42:06 crc kubenswrapper[4563]: I1124 09:42:06.463127 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4ls8d" event={"ID":"1f2ab7de-6a8e-4f44-8546-385fae9849fa","Type":"ContainerStarted","Data":"7ce19febe810ffe5fba4bb8a486c45c154055a6bdb0f3c50a5db6f56ca5c1307"} Nov 24 09:42:07 crc kubenswrapper[4563]: I1124 09:42:07.473488 4563 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-4ls8d" event={"ID":"1f2ab7de-6a8e-4f44-8546-385fae9849fa","Type":"ContainerStarted","Data":"81e8362dc23c86932ad7fb3986674ca4eac1daf454b8f510fea8d25783853a23"} Nov 24 09:42:08 crc kubenswrapper[4563]: I1124 09:42:08.482224 4563 generic.go:334] "Generic (PLEG): container finished" podID="1f2ab7de-6a8e-4f44-8546-385fae9849fa" containerID="81e8362dc23c86932ad7fb3986674ca4eac1daf454b8f510fea8d25783853a23" exitCode=0 Nov 24 09:42:08 crc kubenswrapper[4563]: I1124 09:42:08.482269 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4ls8d" event={"ID":"1f2ab7de-6a8e-4f44-8546-385fae9849fa","Type":"ContainerDied","Data":"81e8362dc23c86932ad7fb3986674ca4eac1daf454b8f510fea8d25783853a23"} Nov 24 09:42:09 crc kubenswrapper[4563]: I1124 09:42:09.491030 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4ls8d" event={"ID":"1f2ab7de-6a8e-4f44-8546-385fae9849fa","Type":"ContainerStarted","Data":"df4c32b5a6cfd0e6f9107205edd5ee4fb13c2264ee9601eb9b05382724b1fb3e"} Nov 24 09:42:09 crc kubenswrapper[4563]: I1124 09:42:09.507376 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4ls8d" podStartSLOduration=2.992291185 podStartE2EDuration="5.507358167s" podCreationTimestamp="2025-11-24 09:42:04 +0000 UTC" firstStartedPulling="2025-11-24 09:42:06.46483784 +0000 UTC m=+2303.723815287" lastFinishedPulling="2025-11-24 09:42:08.979904821 +0000 UTC m=+2306.238882269" observedRunningTime="2025-11-24 09:42:09.505689681 +0000 UTC m=+2306.764667127" watchObservedRunningTime="2025-11-24 09:42:09.507358167 +0000 UTC m=+2306.766335614" Nov 24 09:42:15 crc kubenswrapper[4563]: I1124 09:42:15.150256 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4ls8d" Nov 24 09:42:15 crc kubenswrapper[4563]: I1124 
09:42:15.150580 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4ls8d" Nov 24 09:42:15 crc kubenswrapper[4563]: I1124 09:42:15.182471 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4ls8d" Nov 24 09:42:15 crc kubenswrapper[4563]: I1124 09:42:15.558484 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4ls8d" Nov 24 09:42:15 crc kubenswrapper[4563]: I1124 09:42:15.591388 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4ls8d"] Nov 24 09:42:17 crc kubenswrapper[4563]: I1124 09:42:17.541211 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4ls8d" podUID="1f2ab7de-6a8e-4f44-8546-385fae9849fa" containerName="registry-server" containerID="cri-o://df4c32b5a6cfd0e6f9107205edd5ee4fb13c2264ee9601eb9b05382724b1fb3e" gracePeriod=2 Nov 24 09:42:17 crc kubenswrapper[4563]: I1124 09:42:17.937186 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4ls8d" Nov 24 09:42:18 crc kubenswrapper[4563]: I1124 09:42:18.121839 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f2ab7de-6a8e-4f44-8546-385fae9849fa-catalog-content\") pod \"1f2ab7de-6a8e-4f44-8546-385fae9849fa\" (UID: \"1f2ab7de-6a8e-4f44-8546-385fae9849fa\") " Nov 24 09:42:18 crc kubenswrapper[4563]: I1124 09:42:18.121927 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx2rw\" (UniqueName: \"kubernetes.io/projected/1f2ab7de-6a8e-4f44-8546-385fae9849fa-kube-api-access-bx2rw\") pod \"1f2ab7de-6a8e-4f44-8546-385fae9849fa\" (UID: \"1f2ab7de-6a8e-4f44-8546-385fae9849fa\") " Nov 24 09:42:18 crc kubenswrapper[4563]: I1124 09:42:18.122001 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f2ab7de-6a8e-4f44-8546-385fae9849fa-utilities\") pod \"1f2ab7de-6a8e-4f44-8546-385fae9849fa\" (UID: \"1f2ab7de-6a8e-4f44-8546-385fae9849fa\") " Nov 24 09:42:18 crc kubenswrapper[4563]: I1124 09:42:18.122909 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f2ab7de-6a8e-4f44-8546-385fae9849fa-utilities" (OuterVolumeSpecName: "utilities") pod "1f2ab7de-6a8e-4f44-8546-385fae9849fa" (UID: "1f2ab7de-6a8e-4f44-8546-385fae9849fa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:42:18 crc kubenswrapper[4563]: I1124 09:42:18.127286 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f2ab7de-6a8e-4f44-8546-385fae9849fa-kube-api-access-bx2rw" (OuterVolumeSpecName: "kube-api-access-bx2rw") pod "1f2ab7de-6a8e-4f44-8546-385fae9849fa" (UID: "1f2ab7de-6a8e-4f44-8546-385fae9849fa"). InnerVolumeSpecName "kube-api-access-bx2rw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:42:18 crc kubenswrapper[4563]: I1124 09:42:18.156769 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f2ab7de-6a8e-4f44-8546-385fae9849fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f2ab7de-6a8e-4f44-8546-385fae9849fa" (UID: "1f2ab7de-6a8e-4f44-8546-385fae9849fa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:42:18 crc kubenswrapper[4563]: I1124 09:42:18.223894 4563 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f2ab7de-6a8e-4f44-8546-385fae9849fa-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 09:42:18 crc kubenswrapper[4563]: I1124 09:42:18.223921 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx2rw\" (UniqueName: \"kubernetes.io/projected/1f2ab7de-6a8e-4f44-8546-385fae9849fa-kube-api-access-bx2rw\") on node \"crc\" DevicePath \"\"" Nov 24 09:42:18 crc kubenswrapper[4563]: I1124 09:42:18.223932 4563 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f2ab7de-6a8e-4f44-8546-385fae9849fa-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 09:42:18 crc kubenswrapper[4563]: I1124 09:42:18.550151 4563 generic.go:334] "Generic (PLEG): container finished" podID="1f2ab7de-6a8e-4f44-8546-385fae9849fa" containerID="df4c32b5a6cfd0e6f9107205edd5ee4fb13c2264ee9601eb9b05382724b1fb3e" exitCode=0 Nov 24 09:42:18 crc kubenswrapper[4563]: I1124 09:42:18.550202 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4ls8d" Nov 24 09:42:18 crc kubenswrapper[4563]: I1124 09:42:18.550220 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4ls8d" event={"ID":"1f2ab7de-6a8e-4f44-8546-385fae9849fa","Type":"ContainerDied","Data":"df4c32b5a6cfd0e6f9107205edd5ee4fb13c2264ee9601eb9b05382724b1fb3e"} Nov 24 09:42:18 crc kubenswrapper[4563]: I1124 09:42:18.550519 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4ls8d" event={"ID":"1f2ab7de-6a8e-4f44-8546-385fae9849fa","Type":"ContainerDied","Data":"7ce19febe810ffe5fba4bb8a486c45c154055a6bdb0f3c50a5db6f56ca5c1307"} Nov 24 09:42:18 crc kubenswrapper[4563]: I1124 09:42:18.550536 4563 scope.go:117] "RemoveContainer" containerID="df4c32b5a6cfd0e6f9107205edd5ee4fb13c2264ee9601eb9b05382724b1fb3e" Nov 24 09:42:18 crc kubenswrapper[4563]: I1124 09:42:18.565980 4563 scope.go:117] "RemoveContainer" containerID="81e8362dc23c86932ad7fb3986674ca4eac1daf454b8f510fea8d25783853a23" Nov 24 09:42:18 crc kubenswrapper[4563]: I1124 09:42:18.585176 4563 scope.go:117] "RemoveContainer" containerID="b0b6f96ec13dca714ac7a7f6e95a19ebf28340a47333d11bc126b34d04e990c6" Nov 24 09:42:18 crc kubenswrapper[4563]: I1124 09:42:18.585191 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4ls8d"] Nov 24 09:42:18 crc kubenswrapper[4563]: I1124 09:42:18.590860 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4ls8d"] Nov 24 09:42:18 crc kubenswrapper[4563]: I1124 09:42:18.613293 4563 scope.go:117] "RemoveContainer" containerID="df4c32b5a6cfd0e6f9107205edd5ee4fb13c2264ee9601eb9b05382724b1fb3e" Nov 24 09:42:18 crc kubenswrapper[4563]: E1124 09:42:18.613614 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"df4c32b5a6cfd0e6f9107205edd5ee4fb13c2264ee9601eb9b05382724b1fb3e\": container with ID starting with df4c32b5a6cfd0e6f9107205edd5ee4fb13c2264ee9601eb9b05382724b1fb3e not found: ID does not exist" containerID="df4c32b5a6cfd0e6f9107205edd5ee4fb13c2264ee9601eb9b05382724b1fb3e" Nov 24 09:42:18 crc kubenswrapper[4563]: I1124 09:42:18.613663 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df4c32b5a6cfd0e6f9107205edd5ee4fb13c2264ee9601eb9b05382724b1fb3e"} err="failed to get container status \"df4c32b5a6cfd0e6f9107205edd5ee4fb13c2264ee9601eb9b05382724b1fb3e\": rpc error: code = NotFound desc = could not find container \"df4c32b5a6cfd0e6f9107205edd5ee4fb13c2264ee9601eb9b05382724b1fb3e\": container with ID starting with df4c32b5a6cfd0e6f9107205edd5ee4fb13c2264ee9601eb9b05382724b1fb3e not found: ID does not exist" Nov 24 09:42:18 crc kubenswrapper[4563]: I1124 09:42:18.613691 4563 scope.go:117] "RemoveContainer" containerID="81e8362dc23c86932ad7fb3986674ca4eac1daf454b8f510fea8d25783853a23" Nov 24 09:42:18 crc kubenswrapper[4563]: E1124 09:42:18.614021 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81e8362dc23c86932ad7fb3986674ca4eac1daf454b8f510fea8d25783853a23\": container with ID starting with 81e8362dc23c86932ad7fb3986674ca4eac1daf454b8f510fea8d25783853a23 not found: ID does not exist" containerID="81e8362dc23c86932ad7fb3986674ca4eac1daf454b8f510fea8d25783853a23" Nov 24 09:42:18 crc kubenswrapper[4563]: I1124 09:42:18.614050 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81e8362dc23c86932ad7fb3986674ca4eac1daf454b8f510fea8d25783853a23"} err="failed to get container status \"81e8362dc23c86932ad7fb3986674ca4eac1daf454b8f510fea8d25783853a23\": rpc error: code = NotFound desc = could not find container \"81e8362dc23c86932ad7fb3986674ca4eac1daf454b8f510fea8d25783853a23\": container with ID 
starting with 81e8362dc23c86932ad7fb3986674ca4eac1daf454b8f510fea8d25783853a23 not found: ID does not exist" Nov 24 09:42:18 crc kubenswrapper[4563]: I1124 09:42:18.614071 4563 scope.go:117] "RemoveContainer" containerID="b0b6f96ec13dca714ac7a7f6e95a19ebf28340a47333d11bc126b34d04e990c6" Nov 24 09:42:18 crc kubenswrapper[4563]: E1124 09:42:18.614375 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0b6f96ec13dca714ac7a7f6e95a19ebf28340a47333d11bc126b34d04e990c6\": container with ID starting with b0b6f96ec13dca714ac7a7f6e95a19ebf28340a47333d11bc126b34d04e990c6 not found: ID does not exist" containerID="b0b6f96ec13dca714ac7a7f6e95a19ebf28340a47333d11bc126b34d04e990c6" Nov 24 09:42:18 crc kubenswrapper[4563]: I1124 09:42:18.614393 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0b6f96ec13dca714ac7a7f6e95a19ebf28340a47333d11bc126b34d04e990c6"} err="failed to get container status \"b0b6f96ec13dca714ac7a7f6e95a19ebf28340a47333d11bc126b34d04e990c6\": rpc error: code = NotFound desc = could not find container \"b0b6f96ec13dca714ac7a7f6e95a19ebf28340a47333d11bc126b34d04e990c6\": container with ID starting with b0b6f96ec13dca714ac7a7f6e95a19ebf28340a47333d11bc126b34d04e990c6 not found: ID does not exist" Nov 24 09:42:19 crc kubenswrapper[4563]: I1124 09:42:19.062960 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f2ab7de-6a8e-4f44-8546-385fae9849fa" path="/var/lib/kubelet/pods/1f2ab7de-6a8e-4f44-8546-385fae9849fa/volumes" Nov 24 09:42:38 crc kubenswrapper[4563]: I1124 09:42:38.987782 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:42:38 crc kubenswrapper[4563]: I1124 
09:42:38.988415 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:43:02 crc kubenswrapper[4563]: I1124 09:43:02.551017 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d4zdm"] Nov 24 09:43:02 crc kubenswrapper[4563]: E1124 09:43:02.551759 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f2ab7de-6a8e-4f44-8546-385fae9849fa" containerName="registry-server" Nov 24 09:43:02 crc kubenswrapper[4563]: I1124 09:43:02.551772 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f2ab7de-6a8e-4f44-8546-385fae9849fa" containerName="registry-server" Nov 24 09:43:02 crc kubenswrapper[4563]: E1124 09:43:02.551787 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f2ab7de-6a8e-4f44-8546-385fae9849fa" containerName="extract-content" Nov 24 09:43:02 crc kubenswrapper[4563]: I1124 09:43:02.551793 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f2ab7de-6a8e-4f44-8546-385fae9849fa" containerName="extract-content" Nov 24 09:43:02 crc kubenswrapper[4563]: E1124 09:43:02.551814 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f2ab7de-6a8e-4f44-8546-385fae9849fa" containerName="extract-utilities" Nov 24 09:43:02 crc kubenswrapper[4563]: I1124 09:43:02.551822 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f2ab7de-6a8e-4f44-8546-385fae9849fa" containerName="extract-utilities" Nov 24 09:43:02 crc kubenswrapper[4563]: I1124 09:43:02.552025 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f2ab7de-6a8e-4f44-8546-385fae9849fa" containerName="registry-server" Nov 24 09:43:02 crc kubenswrapper[4563]: I1124 09:43:02.553296 4563 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d4zdm" Nov 24 09:43:02 crc kubenswrapper[4563]: I1124 09:43:02.559912 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d4zdm"] Nov 24 09:43:02 crc kubenswrapper[4563]: I1124 09:43:02.722860 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/619ea327-b839-4e7a-8e23-f2a8ed6d4014-catalog-content\") pod \"community-operators-d4zdm\" (UID: \"619ea327-b839-4e7a-8e23-f2a8ed6d4014\") " pod="openshift-marketplace/community-operators-d4zdm" Nov 24 09:43:02 crc kubenswrapper[4563]: I1124 09:43:02.722930 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/619ea327-b839-4e7a-8e23-f2a8ed6d4014-utilities\") pod \"community-operators-d4zdm\" (UID: \"619ea327-b839-4e7a-8e23-f2a8ed6d4014\") " pod="openshift-marketplace/community-operators-d4zdm" Nov 24 09:43:02 crc kubenswrapper[4563]: I1124 09:43:02.723233 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm9jm\" (UniqueName: \"kubernetes.io/projected/619ea327-b839-4e7a-8e23-f2a8ed6d4014-kube-api-access-gm9jm\") pod \"community-operators-d4zdm\" (UID: \"619ea327-b839-4e7a-8e23-f2a8ed6d4014\") " pod="openshift-marketplace/community-operators-d4zdm" Nov 24 09:43:02 crc kubenswrapper[4563]: I1124 09:43:02.824792 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/619ea327-b839-4e7a-8e23-f2a8ed6d4014-catalog-content\") pod \"community-operators-d4zdm\" (UID: \"619ea327-b839-4e7a-8e23-f2a8ed6d4014\") " pod="openshift-marketplace/community-operators-d4zdm" Nov 24 09:43:02 crc kubenswrapper[4563]: I1124 09:43:02.824844 4563 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/619ea327-b839-4e7a-8e23-f2a8ed6d4014-utilities\") pod \"community-operators-d4zdm\" (UID: \"619ea327-b839-4e7a-8e23-f2a8ed6d4014\") " pod="openshift-marketplace/community-operators-d4zdm" Nov 24 09:43:02 crc kubenswrapper[4563]: I1124 09:43:02.824989 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm9jm\" (UniqueName: \"kubernetes.io/projected/619ea327-b839-4e7a-8e23-f2a8ed6d4014-kube-api-access-gm9jm\") pod \"community-operators-d4zdm\" (UID: \"619ea327-b839-4e7a-8e23-f2a8ed6d4014\") " pod="openshift-marketplace/community-operators-d4zdm" Nov 24 09:43:02 crc kubenswrapper[4563]: I1124 09:43:02.825277 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/619ea327-b839-4e7a-8e23-f2a8ed6d4014-catalog-content\") pod \"community-operators-d4zdm\" (UID: \"619ea327-b839-4e7a-8e23-f2a8ed6d4014\") " pod="openshift-marketplace/community-operators-d4zdm" Nov 24 09:43:02 crc kubenswrapper[4563]: I1124 09:43:02.825312 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/619ea327-b839-4e7a-8e23-f2a8ed6d4014-utilities\") pod \"community-operators-d4zdm\" (UID: \"619ea327-b839-4e7a-8e23-f2a8ed6d4014\") " pod="openshift-marketplace/community-operators-d4zdm" Nov 24 09:43:02 crc kubenswrapper[4563]: I1124 09:43:02.841351 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm9jm\" (UniqueName: \"kubernetes.io/projected/619ea327-b839-4e7a-8e23-f2a8ed6d4014-kube-api-access-gm9jm\") pod \"community-operators-d4zdm\" (UID: \"619ea327-b839-4e7a-8e23-f2a8ed6d4014\") " pod="openshift-marketplace/community-operators-d4zdm" Nov 24 09:43:02 crc kubenswrapper[4563]: I1124 09:43:02.871654 4563 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/community-operators-d4zdm" Nov 24 09:43:03 crc kubenswrapper[4563]: I1124 09:43:03.245577 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d4zdm"] Nov 24 09:43:03 crc kubenswrapper[4563]: I1124 09:43:03.854256 4563 generic.go:334] "Generic (PLEG): container finished" podID="619ea327-b839-4e7a-8e23-f2a8ed6d4014" containerID="5dfc0956c5c24dcd99e5202614534ed8d2aff6de1c864e3d57d725375c3d3802" exitCode=0 Nov 24 09:43:03 crc kubenswrapper[4563]: I1124 09:43:03.854301 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4zdm" event={"ID":"619ea327-b839-4e7a-8e23-f2a8ed6d4014","Type":"ContainerDied","Data":"5dfc0956c5c24dcd99e5202614534ed8d2aff6de1c864e3d57d725375c3d3802"} Nov 24 09:43:03 crc kubenswrapper[4563]: I1124 09:43:03.854330 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4zdm" event={"ID":"619ea327-b839-4e7a-8e23-f2a8ed6d4014","Type":"ContainerStarted","Data":"8662d9f7dfa2d7198bf9b284a83cc4cf31f81ea155b8cda2a97ba377d679e0c2"} Nov 24 09:43:05 crc kubenswrapper[4563]: I1124 09:43:05.868681 4563 generic.go:334] "Generic (PLEG): container finished" podID="619ea327-b839-4e7a-8e23-f2a8ed6d4014" containerID="92f800f3a5eaba460a8af8a15cebbccd2e6e308d1defad6d2dd562a7f7872c64" exitCode=0 Nov 24 09:43:05 crc kubenswrapper[4563]: I1124 09:43:05.868792 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4zdm" event={"ID":"619ea327-b839-4e7a-8e23-f2a8ed6d4014","Type":"ContainerDied","Data":"92f800f3a5eaba460a8af8a15cebbccd2e6e308d1defad6d2dd562a7f7872c64"} Nov 24 09:43:06 crc kubenswrapper[4563]: I1124 09:43:06.878166 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4zdm" 
event={"ID":"619ea327-b839-4e7a-8e23-f2a8ed6d4014","Type":"ContainerStarted","Data":"50dbe4b46d333dcf86c574e34dcf2801d67b70ae6c46a49bae4ce2a654b80524"} Nov 24 09:43:06 crc kubenswrapper[4563]: I1124 09:43:06.898906 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d4zdm" podStartSLOduration=2.441292559 podStartE2EDuration="4.898891196s" podCreationTimestamp="2025-11-24 09:43:02 +0000 UTC" firstStartedPulling="2025-11-24 09:43:03.855526626 +0000 UTC m=+2361.114504073" lastFinishedPulling="2025-11-24 09:43:06.313125263 +0000 UTC m=+2363.572102710" observedRunningTime="2025-11-24 09:43:06.892750404 +0000 UTC m=+2364.151727851" watchObservedRunningTime="2025-11-24 09:43:06.898891196 +0000 UTC m=+2364.157868633" Nov 24 09:43:07 crc kubenswrapper[4563]: I1124 09:43:07.349559 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bnfml"] Nov 24 09:43:07 crc kubenswrapper[4563]: I1124 09:43:07.351903 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bnfml" Nov 24 09:43:07 crc kubenswrapper[4563]: I1124 09:43:07.358113 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bnfml"] Nov 24 09:43:07 crc kubenswrapper[4563]: I1124 09:43:07.396581 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f93ca7ee-207f-475f-b020-9b7fa8c54323-catalog-content\") pod \"redhat-operators-bnfml\" (UID: \"f93ca7ee-207f-475f-b020-9b7fa8c54323\") " pod="openshift-marketplace/redhat-operators-bnfml" Nov 24 09:43:07 crc kubenswrapper[4563]: I1124 09:43:07.396814 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f93ca7ee-207f-475f-b020-9b7fa8c54323-utilities\") pod \"redhat-operators-bnfml\" (UID: \"f93ca7ee-207f-475f-b020-9b7fa8c54323\") " pod="openshift-marketplace/redhat-operators-bnfml" Nov 24 09:43:07 crc kubenswrapper[4563]: I1124 09:43:07.396880 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nrwf\" (UniqueName: \"kubernetes.io/projected/f93ca7ee-207f-475f-b020-9b7fa8c54323-kube-api-access-6nrwf\") pod \"redhat-operators-bnfml\" (UID: \"f93ca7ee-207f-475f-b020-9b7fa8c54323\") " pod="openshift-marketplace/redhat-operators-bnfml" Nov 24 09:43:07 crc kubenswrapper[4563]: I1124 09:43:07.498585 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f93ca7ee-207f-475f-b020-9b7fa8c54323-catalog-content\") pod \"redhat-operators-bnfml\" (UID: \"f93ca7ee-207f-475f-b020-9b7fa8c54323\") " pod="openshift-marketplace/redhat-operators-bnfml" Nov 24 09:43:07 crc kubenswrapper[4563]: I1124 09:43:07.498768 4563 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f93ca7ee-207f-475f-b020-9b7fa8c54323-utilities\") pod \"redhat-operators-bnfml\" (UID: \"f93ca7ee-207f-475f-b020-9b7fa8c54323\") " pod="openshift-marketplace/redhat-operators-bnfml" Nov 24 09:43:07 crc kubenswrapper[4563]: I1124 09:43:07.498793 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nrwf\" (UniqueName: \"kubernetes.io/projected/f93ca7ee-207f-475f-b020-9b7fa8c54323-kube-api-access-6nrwf\") pod \"redhat-operators-bnfml\" (UID: \"f93ca7ee-207f-475f-b020-9b7fa8c54323\") " pod="openshift-marketplace/redhat-operators-bnfml" Nov 24 09:43:07 crc kubenswrapper[4563]: I1124 09:43:07.499073 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f93ca7ee-207f-475f-b020-9b7fa8c54323-catalog-content\") pod \"redhat-operators-bnfml\" (UID: \"f93ca7ee-207f-475f-b020-9b7fa8c54323\") " pod="openshift-marketplace/redhat-operators-bnfml" Nov 24 09:43:07 crc kubenswrapper[4563]: I1124 09:43:07.499221 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f93ca7ee-207f-475f-b020-9b7fa8c54323-utilities\") pod \"redhat-operators-bnfml\" (UID: \"f93ca7ee-207f-475f-b020-9b7fa8c54323\") " pod="openshift-marketplace/redhat-operators-bnfml" Nov 24 09:43:07 crc kubenswrapper[4563]: I1124 09:43:07.514148 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nrwf\" (UniqueName: \"kubernetes.io/projected/f93ca7ee-207f-475f-b020-9b7fa8c54323-kube-api-access-6nrwf\") pod \"redhat-operators-bnfml\" (UID: \"f93ca7ee-207f-475f-b020-9b7fa8c54323\") " pod="openshift-marketplace/redhat-operators-bnfml" Nov 24 09:43:07 crc kubenswrapper[4563]: I1124 09:43:07.666220 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bnfml"
Nov 24 09:43:08 crc kubenswrapper[4563]: I1124 09:43:08.051962 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bnfml"]
Nov 24 09:43:08 crc kubenswrapper[4563]: W1124 09:43:08.068706 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf93ca7ee_207f_475f_b020_9b7fa8c54323.slice/crio-c0c610f140fda7dcc6d3a946eb85cd04433522606c815280c96dfb6f987e5e6c WatchSource:0}: Error finding container c0c610f140fda7dcc6d3a946eb85cd04433522606c815280c96dfb6f987e5e6c: Status 404 returned error can't find the container with id c0c610f140fda7dcc6d3a946eb85cd04433522606c815280c96dfb6f987e5e6c
Nov 24 09:43:08 crc kubenswrapper[4563]: I1124 09:43:08.892977 4563 generic.go:334] "Generic (PLEG): container finished" podID="f93ca7ee-207f-475f-b020-9b7fa8c54323" containerID="b7521f882c9d9bdd0c186d030a63c4dc98e2c6699ef8879368efbbccdf705ac1" exitCode=0
Nov 24 09:43:08 crc kubenswrapper[4563]: I1124 09:43:08.893181 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bnfml" event={"ID":"f93ca7ee-207f-475f-b020-9b7fa8c54323","Type":"ContainerDied","Data":"b7521f882c9d9bdd0c186d030a63c4dc98e2c6699ef8879368efbbccdf705ac1"}
Nov 24 09:43:08 crc kubenswrapper[4563]: I1124 09:43:08.893317 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bnfml" event={"ID":"f93ca7ee-207f-475f-b020-9b7fa8c54323","Type":"ContainerStarted","Data":"c0c610f140fda7dcc6d3a946eb85cd04433522606c815280c96dfb6f987e5e6c"}
Nov 24 09:43:08 crc kubenswrapper[4563]: I1124 09:43:08.987518 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 24 09:43:08 crc kubenswrapper[4563]: I1124 09:43:08.987572 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 24 09:43:09 crc kubenswrapper[4563]: I1124 09:43:09.902513 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bnfml" event={"ID":"f93ca7ee-207f-475f-b020-9b7fa8c54323","Type":"ContainerStarted","Data":"3660bc3a5007fa86a7b21a3f37b30bec9ebc055415b4b70b4fe29c72ee53ed3f"}
Nov 24 09:43:12 crc kubenswrapper[4563]: I1124 09:43:12.872471 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d4zdm"
Nov 24 09:43:12 crc kubenswrapper[4563]: I1124 09:43:12.872889 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d4zdm"
Nov 24 09:43:12 crc kubenswrapper[4563]: I1124 09:43:12.906215 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d4zdm"
Nov 24 09:43:12 crc kubenswrapper[4563]: I1124 09:43:12.925147 4563 generic.go:334] "Generic (PLEG): container finished" podID="f93ca7ee-207f-475f-b020-9b7fa8c54323" containerID="3660bc3a5007fa86a7b21a3f37b30bec9ebc055415b4b70b4fe29c72ee53ed3f" exitCode=0
Nov 24 09:43:12 crc kubenswrapper[4563]: I1124 09:43:12.925189 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bnfml" event={"ID":"f93ca7ee-207f-475f-b020-9b7fa8c54323","Type":"ContainerDied","Data":"3660bc3a5007fa86a7b21a3f37b30bec9ebc055415b4b70b4fe29c72ee53ed3f"}
Nov 24 09:43:12 crc kubenswrapper[4563]: I1124 09:43:12.956239 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d4zdm"
Nov 24 09:43:13 crc kubenswrapper[4563]: I1124 09:43:13.934294 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bnfml" event={"ID":"f93ca7ee-207f-475f-b020-9b7fa8c54323","Type":"ContainerStarted","Data":"de0e4d231b36fce98edd0718f94365b6546c207fa168ad56ac511952e3fa14ec"}
Nov 24 09:43:13 crc kubenswrapper[4563]: I1124 09:43:13.949280 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bnfml" podStartSLOduration=2.458691915 podStartE2EDuration="6.949267654s" podCreationTimestamp="2025-11-24 09:43:07 +0000 UTC" firstStartedPulling="2025-11-24 09:43:08.89471173 +0000 UTC m=+2366.153689177" lastFinishedPulling="2025-11-24 09:43:13.385287468 +0000 UTC m=+2370.644264916" observedRunningTime="2025-11-24 09:43:13.945424636 +0000 UTC m=+2371.204402084" watchObservedRunningTime="2025-11-24 09:43:13.949267654 +0000 UTC m=+2371.208245101"
Nov 24 09:43:15 crc kubenswrapper[4563]: I1124 09:43:15.344431 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d4zdm"]
Nov 24 09:43:15 crc kubenswrapper[4563]: I1124 09:43:15.345374 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d4zdm" podUID="619ea327-b839-4e7a-8e23-f2a8ed6d4014" containerName="registry-server" containerID="cri-o://50dbe4b46d333dcf86c574e34dcf2801d67b70ae6c46a49bae4ce2a654b80524" gracePeriod=2
Nov 24 09:43:15 crc kubenswrapper[4563]: I1124 09:43:15.748307 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d4zdm"
Nov 24 09:43:15 crc kubenswrapper[4563]: I1124 09:43:15.945384 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/619ea327-b839-4e7a-8e23-f2a8ed6d4014-utilities\") pod \"619ea327-b839-4e7a-8e23-f2a8ed6d4014\" (UID: \"619ea327-b839-4e7a-8e23-f2a8ed6d4014\") "
Nov 24 09:43:15 crc kubenswrapper[4563]: I1124 09:43:15.945828 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/619ea327-b839-4e7a-8e23-f2a8ed6d4014-utilities" (OuterVolumeSpecName: "utilities") pod "619ea327-b839-4e7a-8e23-f2a8ed6d4014" (UID: "619ea327-b839-4e7a-8e23-f2a8ed6d4014"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 09:43:15 crc kubenswrapper[4563]: I1124 09:43:15.945836 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm9jm\" (UniqueName: \"kubernetes.io/projected/619ea327-b839-4e7a-8e23-f2a8ed6d4014-kube-api-access-gm9jm\") pod \"619ea327-b839-4e7a-8e23-f2a8ed6d4014\" (UID: \"619ea327-b839-4e7a-8e23-f2a8ed6d4014\") "
Nov 24 09:43:15 crc kubenswrapper[4563]: I1124 09:43:15.945937 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/619ea327-b839-4e7a-8e23-f2a8ed6d4014-catalog-content\") pod \"619ea327-b839-4e7a-8e23-f2a8ed6d4014\" (UID: \"619ea327-b839-4e7a-8e23-f2a8ed6d4014\") "
Nov 24 09:43:15 crc kubenswrapper[4563]: I1124 09:43:15.946835 4563 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/619ea327-b839-4e7a-8e23-f2a8ed6d4014-utilities\") on node \"crc\" DevicePath \"\""
Nov 24 09:43:15 crc kubenswrapper[4563]: I1124 09:43:15.950662 4563 generic.go:334] "Generic (PLEG): container finished" podID="619ea327-b839-4e7a-8e23-f2a8ed6d4014" containerID="50dbe4b46d333dcf86c574e34dcf2801d67b70ae6c46a49bae4ce2a654b80524" exitCode=0
Nov 24 09:43:15 crc kubenswrapper[4563]: I1124 09:43:15.950702 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4zdm" event={"ID":"619ea327-b839-4e7a-8e23-f2a8ed6d4014","Type":"ContainerDied","Data":"50dbe4b46d333dcf86c574e34dcf2801d67b70ae6c46a49bae4ce2a654b80524"}
Nov 24 09:43:15 crc kubenswrapper[4563]: I1124 09:43:15.950725 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d4zdm"
Nov 24 09:43:15 crc kubenswrapper[4563]: I1124 09:43:15.950733 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4zdm" event={"ID":"619ea327-b839-4e7a-8e23-f2a8ed6d4014","Type":"ContainerDied","Data":"8662d9f7dfa2d7198bf9b284a83cc4cf31f81ea155b8cda2a97ba377d679e0c2"}
Nov 24 09:43:15 crc kubenswrapper[4563]: I1124 09:43:15.950762 4563 scope.go:117] "RemoveContainer" containerID="50dbe4b46d333dcf86c574e34dcf2801d67b70ae6c46a49bae4ce2a654b80524"
Nov 24 09:43:15 crc kubenswrapper[4563]: I1124 09:43:15.950968 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/619ea327-b839-4e7a-8e23-f2a8ed6d4014-kube-api-access-gm9jm" (OuterVolumeSpecName: "kube-api-access-gm9jm") pod "619ea327-b839-4e7a-8e23-f2a8ed6d4014" (UID: "619ea327-b839-4e7a-8e23-f2a8ed6d4014"). InnerVolumeSpecName "kube-api-access-gm9jm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 09:43:15 crc kubenswrapper[4563]: I1124 09:43:15.991200 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/619ea327-b839-4e7a-8e23-f2a8ed6d4014-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "619ea327-b839-4e7a-8e23-f2a8ed6d4014" (UID: "619ea327-b839-4e7a-8e23-f2a8ed6d4014"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 09:43:15 crc kubenswrapper[4563]: I1124 09:43:15.998377 4563 scope.go:117] "RemoveContainer" containerID="92f800f3a5eaba460a8af8a15cebbccd2e6e308d1defad6d2dd562a7f7872c64"
Nov 24 09:43:16 crc kubenswrapper[4563]: I1124 09:43:16.019313 4563 scope.go:117] "RemoveContainer" containerID="5dfc0956c5c24dcd99e5202614534ed8d2aff6de1c864e3d57d725375c3d3802"
Nov 24 09:43:16 crc kubenswrapper[4563]: I1124 09:43:16.048578 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm9jm\" (UniqueName: \"kubernetes.io/projected/619ea327-b839-4e7a-8e23-f2a8ed6d4014-kube-api-access-gm9jm\") on node \"crc\" DevicePath \"\""
Nov 24 09:43:16 crc kubenswrapper[4563]: I1124 09:43:16.048606 4563 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/619ea327-b839-4e7a-8e23-f2a8ed6d4014-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 24 09:43:16 crc kubenswrapper[4563]: I1124 09:43:16.053192 4563 scope.go:117] "RemoveContainer" containerID="50dbe4b46d333dcf86c574e34dcf2801d67b70ae6c46a49bae4ce2a654b80524"
Nov 24 09:43:16 crc kubenswrapper[4563]: E1124 09:43:16.053592 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50dbe4b46d333dcf86c574e34dcf2801d67b70ae6c46a49bae4ce2a654b80524\": container with ID starting with 50dbe4b46d333dcf86c574e34dcf2801d67b70ae6c46a49bae4ce2a654b80524 not found: ID does not exist" containerID="50dbe4b46d333dcf86c574e34dcf2801d67b70ae6c46a49bae4ce2a654b80524"
Nov 24 09:43:16 crc kubenswrapper[4563]: I1124 09:43:16.053633 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50dbe4b46d333dcf86c574e34dcf2801d67b70ae6c46a49bae4ce2a654b80524"} err="failed to get container status \"50dbe4b46d333dcf86c574e34dcf2801d67b70ae6c46a49bae4ce2a654b80524\": rpc error: code = NotFound desc = could not find container \"50dbe4b46d333dcf86c574e34dcf2801d67b70ae6c46a49bae4ce2a654b80524\": container with ID starting with 50dbe4b46d333dcf86c574e34dcf2801d67b70ae6c46a49bae4ce2a654b80524 not found: ID does not exist"
Nov 24 09:43:16 crc kubenswrapper[4563]: I1124 09:43:16.053675 4563 scope.go:117] "RemoveContainer" containerID="92f800f3a5eaba460a8af8a15cebbccd2e6e308d1defad6d2dd562a7f7872c64"
Nov 24 09:43:16 crc kubenswrapper[4563]: E1124 09:43:16.054064 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92f800f3a5eaba460a8af8a15cebbccd2e6e308d1defad6d2dd562a7f7872c64\": container with ID starting with 92f800f3a5eaba460a8af8a15cebbccd2e6e308d1defad6d2dd562a7f7872c64 not found: ID does not exist" containerID="92f800f3a5eaba460a8af8a15cebbccd2e6e308d1defad6d2dd562a7f7872c64"
Nov 24 09:43:16 crc kubenswrapper[4563]: I1124 09:43:16.054098 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92f800f3a5eaba460a8af8a15cebbccd2e6e308d1defad6d2dd562a7f7872c64"} err="failed to get container status \"92f800f3a5eaba460a8af8a15cebbccd2e6e308d1defad6d2dd562a7f7872c64\": rpc error: code = NotFound desc = could not find container \"92f800f3a5eaba460a8af8a15cebbccd2e6e308d1defad6d2dd562a7f7872c64\": container with ID starting with 92f800f3a5eaba460a8af8a15cebbccd2e6e308d1defad6d2dd562a7f7872c64 not found: ID does not exist"
Nov 24 09:43:16 crc kubenswrapper[4563]: I1124 09:43:16.054143 4563 scope.go:117] "RemoveContainer" containerID="5dfc0956c5c24dcd99e5202614534ed8d2aff6de1c864e3d57d725375c3d3802"
Nov 24 09:43:16 crc kubenswrapper[4563]: E1124 09:43:16.054502 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dfc0956c5c24dcd99e5202614534ed8d2aff6de1c864e3d57d725375c3d3802\": container with ID starting with 5dfc0956c5c24dcd99e5202614534ed8d2aff6de1c864e3d57d725375c3d3802 not found: ID does not exist" containerID="5dfc0956c5c24dcd99e5202614534ed8d2aff6de1c864e3d57d725375c3d3802"
Nov 24 09:43:16 crc kubenswrapper[4563]: I1124 09:43:16.054532 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dfc0956c5c24dcd99e5202614534ed8d2aff6de1c864e3d57d725375c3d3802"} err="failed to get container status \"5dfc0956c5c24dcd99e5202614534ed8d2aff6de1c864e3d57d725375c3d3802\": rpc error: code = NotFound desc = could not find container \"5dfc0956c5c24dcd99e5202614534ed8d2aff6de1c864e3d57d725375c3d3802\": container with ID starting with 5dfc0956c5c24dcd99e5202614534ed8d2aff6de1c864e3d57d725375c3d3802 not found: ID does not exist"
Nov 24 09:43:16 crc kubenswrapper[4563]: I1124 09:43:16.284232 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d4zdm"]
Nov 24 09:43:16 crc kubenswrapper[4563]: I1124 09:43:16.290627 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d4zdm"]
Nov 24 09:43:17 crc kubenswrapper[4563]: I1124 09:43:17.062825 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="619ea327-b839-4e7a-8e23-f2a8ed6d4014" path="/var/lib/kubelet/pods/619ea327-b839-4e7a-8e23-f2a8ed6d4014/volumes"
Nov 24 09:43:17 crc kubenswrapper[4563]: I1124 09:43:17.666691 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bnfml"
Nov 24 09:43:17 crc kubenswrapper[4563]: I1124 09:43:17.666780 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bnfml"
Nov 24 09:43:18 crc kubenswrapper[4563]: I1124 09:43:18.701095 4563 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bnfml" podUID="f93ca7ee-207f-475f-b020-9b7fa8c54323" containerName="registry-server" probeResult="failure" output=<
Nov 24 09:43:18 crc kubenswrapper[4563]: timeout: failed to connect service ":50051" within 1s
Nov 24 09:43:18 crc kubenswrapper[4563]: >
Nov 24 09:43:27 crc kubenswrapper[4563]: I1124 09:43:27.699151 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bnfml"
Nov 24 09:43:27 crc kubenswrapper[4563]: I1124 09:43:27.732160 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bnfml"
Nov 24 09:43:27 crc kubenswrapper[4563]: I1124 09:43:27.925490 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bnfml"]
Nov 24 09:43:29 crc kubenswrapper[4563]: I1124 09:43:29.034209 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bnfml" podUID="f93ca7ee-207f-475f-b020-9b7fa8c54323" containerName="registry-server" containerID="cri-o://de0e4d231b36fce98edd0718f94365b6546c207fa168ad56ac511952e3fa14ec" gracePeriod=2
Nov 24 09:43:29 crc kubenswrapper[4563]: I1124 09:43:29.433190 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bnfml"
Nov 24 09:43:29 crc kubenswrapper[4563]: I1124 09:43:29.560449 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f93ca7ee-207f-475f-b020-9b7fa8c54323-catalog-content\") pod \"f93ca7ee-207f-475f-b020-9b7fa8c54323\" (UID: \"f93ca7ee-207f-475f-b020-9b7fa8c54323\") "
Nov 24 09:43:29 crc kubenswrapper[4563]: I1124 09:43:29.560495 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nrwf\" (UniqueName: \"kubernetes.io/projected/f93ca7ee-207f-475f-b020-9b7fa8c54323-kube-api-access-6nrwf\") pod \"f93ca7ee-207f-475f-b020-9b7fa8c54323\" (UID: \"f93ca7ee-207f-475f-b020-9b7fa8c54323\") "
Nov 24 09:43:29 crc kubenswrapper[4563]: I1124 09:43:29.560544 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f93ca7ee-207f-475f-b020-9b7fa8c54323-utilities\") pod \"f93ca7ee-207f-475f-b020-9b7fa8c54323\" (UID: \"f93ca7ee-207f-475f-b020-9b7fa8c54323\") "
Nov 24 09:43:29 crc kubenswrapper[4563]: I1124 09:43:29.561384 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f93ca7ee-207f-475f-b020-9b7fa8c54323-utilities" (OuterVolumeSpecName: "utilities") pod "f93ca7ee-207f-475f-b020-9b7fa8c54323" (UID: "f93ca7ee-207f-475f-b020-9b7fa8c54323"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 09:43:29 crc kubenswrapper[4563]: I1124 09:43:29.564738 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f93ca7ee-207f-475f-b020-9b7fa8c54323-kube-api-access-6nrwf" (OuterVolumeSpecName: "kube-api-access-6nrwf") pod "f93ca7ee-207f-475f-b020-9b7fa8c54323" (UID: "f93ca7ee-207f-475f-b020-9b7fa8c54323"). InnerVolumeSpecName "kube-api-access-6nrwf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 24 09:43:29 crc kubenswrapper[4563]: I1124 09:43:29.625460 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f93ca7ee-207f-475f-b020-9b7fa8c54323-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f93ca7ee-207f-475f-b020-9b7fa8c54323" (UID: "f93ca7ee-207f-475f-b020-9b7fa8c54323"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 24 09:43:29 crc kubenswrapper[4563]: I1124 09:43:29.662678 4563 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f93ca7ee-207f-475f-b020-9b7fa8c54323-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 24 09:43:29 crc kubenswrapper[4563]: I1124 09:43:29.662703 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nrwf\" (UniqueName: \"kubernetes.io/projected/f93ca7ee-207f-475f-b020-9b7fa8c54323-kube-api-access-6nrwf\") on node \"crc\" DevicePath \"\""
Nov 24 09:43:29 crc kubenswrapper[4563]: I1124 09:43:29.662716 4563 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f93ca7ee-207f-475f-b020-9b7fa8c54323-utilities\") on node \"crc\" DevicePath \"\""
Nov 24 09:43:30 crc kubenswrapper[4563]: I1124 09:43:30.043054 4563 generic.go:334] "Generic (PLEG): container finished" podID="f93ca7ee-207f-475f-b020-9b7fa8c54323" containerID="de0e4d231b36fce98edd0718f94365b6546c207fa168ad56ac511952e3fa14ec" exitCode=0
Nov 24 09:43:30 crc kubenswrapper[4563]: I1124 09:43:30.043107 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bnfml" event={"ID":"f93ca7ee-207f-475f-b020-9b7fa8c54323","Type":"ContainerDied","Data":"de0e4d231b36fce98edd0718f94365b6546c207fa168ad56ac511952e3fa14ec"}
Nov 24 09:43:30 crc kubenswrapper[4563]: I1124 09:43:30.043134 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bnfml"
Nov 24 09:43:30 crc kubenswrapper[4563]: I1124 09:43:30.043148 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bnfml" event={"ID":"f93ca7ee-207f-475f-b020-9b7fa8c54323","Type":"ContainerDied","Data":"c0c610f140fda7dcc6d3a946eb85cd04433522606c815280c96dfb6f987e5e6c"}
Nov 24 09:43:30 crc kubenswrapper[4563]: I1124 09:43:30.043167 4563 scope.go:117] "RemoveContainer" containerID="de0e4d231b36fce98edd0718f94365b6546c207fa168ad56ac511952e3fa14ec"
Nov 24 09:43:30 crc kubenswrapper[4563]: I1124 09:43:30.061343 4563 scope.go:117] "RemoveContainer" containerID="3660bc3a5007fa86a7b21a3f37b30bec9ebc055415b4b70b4fe29c72ee53ed3f"
Nov 24 09:43:30 crc kubenswrapper[4563]: I1124 09:43:30.068250 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bnfml"]
Nov 24 09:43:30 crc kubenswrapper[4563]: I1124 09:43:30.074176 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bnfml"]
Nov 24 09:43:30 crc kubenswrapper[4563]: I1124 09:43:30.089738 4563 scope.go:117] "RemoveContainer" containerID="b7521f882c9d9bdd0c186d030a63c4dc98e2c6699ef8879368efbbccdf705ac1"
Nov 24 09:43:30 crc kubenswrapper[4563]: I1124 09:43:30.109096 4563 scope.go:117] "RemoveContainer" containerID="de0e4d231b36fce98edd0718f94365b6546c207fa168ad56ac511952e3fa14ec"
Nov 24 09:43:30 crc kubenswrapper[4563]: E1124 09:43:30.109403 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de0e4d231b36fce98edd0718f94365b6546c207fa168ad56ac511952e3fa14ec\": container with ID starting with de0e4d231b36fce98edd0718f94365b6546c207fa168ad56ac511952e3fa14ec not found: ID does not exist" containerID="de0e4d231b36fce98edd0718f94365b6546c207fa168ad56ac511952e3fa14ec"
Nov 24 09:43:30 crc kubenswrapper[4563]: I1124 09:43:30.109438 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de0e4d231b36fce98edd0718f94365b6546c207fa168ad56ac511952e3fa14ec"} err="failed to get container status \"de0e4d231b36fce98edd0718f94365b6546c207fa168ad56ac511952e3fa14ec\": rpc error: code = NotFound desc = could not find container \"de0e4d231b36fce98edd0718f94365b6546c207fa168ad56ac511952e3fa14ec\": container with ID starting with de0e4d231b36fce98edd0718f94365b6546c207fa168ad56ac511952e3fa14ec not found: ID does not exist"
Nov 24 09:43:30 crc kubenswrapper[4563]: I1124 09:43:30.109465 4563 scope.go:117] "RemoveContainer" containerID="3660bc3a5007fa86a7b21a3f37b30bec9ebc055415b4b70b4fe29c72ee53ed3f"
Nov 24 09:43:30 crc kubenswrapper[4563]: E1124 09:43:30.109743 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3660bc3a5007fa86a7b21a3f37b30bec9ebc055415b4b70b4fe29c72ee53ed3f\": container with ID starting with 3660bc3a5007fa86a7b21a3f37b30bec9ebc055415b4b70b4fe29c72ee53ed3f not found: ID does not exist" containerID="3660bc3a5007fa86a7b21a3f37b30bec9ebc055415b4b70b4fe29c72ee53ed3f"
Nov 24 09:43:30 crc kubenswrapper[4563]: I1124 09:43:30.109771 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3660bc3a5007fa86a7b21a3f37b30bec9ebc055415b4b70b4fe29c72ee53ed3f"} err="failed to get container status \"3660bc3a5007fa86a7b21a3f37b30bec9ebc055415b4b70b4fe29c72ee53ed3f\": rpc error: code = NotFound desc = could not find container \"3660bc3a5007fa86a7b21a3f37b30bec9ebc055415b4b70b4fe29c72ee53ed3f\": container with ID starting with 3660bc3a5007fa86a7b21a3f37b30bec9ebc055415b4b70b4fe29c72ee53ed3f not found: ID does not exist"
Nov 24 09:43:30 crc kubenswrapper[4563]: I1124 09:43:30.109789 4563 scope.go:117] "RemoveContainer" containerID="b7521f882c9d9bdd0c186d030a63c4dc98e2c6699ef8879368efbbccdf705ac1"
Nov 24 09:43:30 crc kubenswrapper[4563]: E1124 09:43:30.110034 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7521f882c9d9bdd0c186d030a63c4dc98e2c6699ef8879368efbbccdf705ac1\": container with ID starting with b7521f882c9d9bdd0c186d030a63c4dc98e2c6699ef8879368efbbccdf705ac1 not found: ID does not exist" containerID="b7521f882c9d9bdd0c186d030a63c4dc98e2c6699ef8879368efbbccdf705ac1"
Nov 24 09:43:30 crc kubenswrapper[4563]: I1124 09:43:30.110080 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7521f882c9d9bdd0c186d030a63c4dc98e2c6699ef8879368efbbccdf705ac1"} err="failed to get container status \"b7521f882c9d9bdd0c186d030a63c4dc98e2c6699ef8879368efbbccdf705ac1\": rpc error: code = NotFound desc = could not find container \"b7521f882c9d9bdd0c186d030a63c4dc98e2c6699ef8879368efbbccdf705ac1\": container with ID starting with b7521f882c9d9bdd0c186d030a63c4dc98e2c6699ef8879368efbbccdf705ac1 not found: ID does not exist"
Nov 24 09:43:31 crc kubenswrapper[4563]: I1124 09:43:31.066842 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f93ca7ee-207f-475f-b020-9b7fa8c54323" path="/var/lib/kubelet/pods/f93ca7ee-207f-475f-b020-9b7fa8c54323/volumes"
Nov 24 09:43:38 crc kubenswrapper[4563]: I1124 09:43:38.988056 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 24 09:43:38 crc kubenswrapper[4563]: I1124 09:43:38.989121 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 24 09:43:38 crc kubenswrapper[4563]: I1124 09:43:38.989237 4563 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stlxr"
Nov 24 09:43:38 crc kubenswrapper[4563]: I1124 09:43:38.989806 4563 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610"} pod="openshift-machine-config-operator/machine-config-daemon-stlxr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 24 09:43:38 crc kubenswrapper[4563]: I1124 09:43:38.989944 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" containerID="cri-o://4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610" gracePeriod=600
Nov 24 09:43:39 crc kubenswrapper[4563]: E1124 09:43:39.109673 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627"
Nov 24 09:43:40 crc kubenswrapper[4563]: I1124 09:43:40.105984 4563 generic.go:334] "Generic (PLEG): container finished" podID="3b2bfe55-8989-49b3-bb61-e28189447627" containerID="4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610" exitCode=0
Nov 24 09:43:40 crc kubenswrapper[4563]: I1124 09:43:40.106056 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" event={"ID":"3b2bfe55-8989-49b3-bb61-e28189447627","Type":"ContainerDied","Data":"4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610"}
Nov 24 09:43:40 crc kubenswrapper[4563]: I1124 09:43:40.106220 4563 scope.go:117] "RemoveContainer" containerID="aacc3722af1959c6fbbc5ddeea8e84512b7ce34c8a0eca9a4336a8f163950d41"
Nov 24 09:43:40 crc kubenswrapper[4563]: I1124 09:43:40.107071 4563 scope.go:117] "RemoveContainer" containerID="4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610"
Nov 24 09:43:40 crc kubenswrapper[4563]: E1124 09:43:40.107293 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627"
Nov 24 09:43:55 crc kubenswrapper[4563]: I1124 09:43:55.055403 4563 scope.go:117] "RemoveContainer" containerID="4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610"
Nov 24 09:43:55 crc kubenswrapper[4563]: E1124 09:43:55.055991 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627"
Nov 24 09:44:10 crc kubenswrapper[4563]: I1124 09:44:10.054193 4563 scope.go:117] "RemoveContainer" containerID="4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610"
Nov 24 09:44:10 crc kubenswrapper[4563]: E1124 09:44:10.055330 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627"
Nov 24 09:44:24 crc kubenswrapper[4563]: I1124 09:44:24.054912 4563 scope.go:117] "RemoveContainer" containerID="4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610"
Nov 24 09:44:24 crc kubenswrapper[4563]: E1124 09:44:24.055818 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627"
Nov 24 09:44:36 crc kubenswrapper[4563]: I1124 09:44:36.054940 4563 scope.go:117] "RemoveContainer" containerID="4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610"
Nov 24 09:44:36 crc kubenswrapper[4563]: E1124 09:44:36.056067 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627"
Nov 24 09:44:49 crc kubenswrapper[4563]: I1124 09:44:49.054608 4563 scope.go:117] "RemoveContainer" containerID="4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610"
Nov 24 09:44:49 crc kubenswrapper[4563]: E1124 09:44:49.055102 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627"
Nov 24 09:45:00 crc kubenswrapper[4563]: I1124 09:45:00.133952 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399625-v9hqh"]
Nov 24 09:45:00 crc kubenswrapper[4563]: E1124 09:45:00.135022 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="619ea327-b839-4e7a-8e23-f2a8ed6d4014" containerName="registry-server"
Nov 24 09:45:00 crc kubenswrapper[4563]: I1124 09:45:00.135039 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="619ea327-b839-4e7a-8e23-f2a8ed6d4014" containerName="registry-server"
Nov 24 09:45:00 crc kubenswrapper[4563]: E1124 09:45:00.135054 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f93ca7ee-207f-475f-b020-9b7fa8c54323" containerName="extract-utilities"
Nov 24 09:45:00 crc kubenswrapper[4563]: I1124 09:45:00.135059 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="f93ca7ee-207f-475f-b020-9b7fa8c54323" containerName="extract-utilities"
Nov 24 09:45:00 crc kubenswrapper[4563]: E1124 09:45:00.135075 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="619ea327-b839-4e7a-8e23-f2a8ed6d4014" containerName="extract-content"
Nov 24 09:45:00 crc kubenswrapper[4563]: I1124 09:45:00.135080 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="619ea327-b839-4e7a-8e23-f2a8ed6d4014" containerName="extract-content"
Nov 24 09:45:00 crc kubenswrapper[4563]: E1124 09:45:00.135094 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f93ca7ee-207f-475f-b020-9b7fa8c54323" containerName="extract-content"
Nov 24 09:45:00 crc kubenswrapper[4563]: I1124 09:45:00.135099 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="f93ca7ee-207f-475f-b020-9b7fa8c54323" containerName="extract-content"
Nov 24 09:45:00 crc kubenswrapper[4563]: E1124 09:45:00.135126 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="619ea327-b839-4e7a-8e23-f2a8ed6d4014" containerName="extract-utilities"
Nov 24 09:45:00 crc kubenswrapper[4563]: I1124 09:45:00.135132 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="619ea327-b839-4e7a-8e23-f2a8ed6d4014" containerName="extract-utilities"
Nov 24 09:45:00 crc kubenswrapper[4563]: E1124 09:45:00.135143 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f93ca7ee-207f-475f-b020-9b7fa8c54323" containerName="registry-server"
Nov 24 09:45:00 crc kubenswrapper[4563]: I1124 09:45:00.135149 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="f93ca7ee-207f-475f-b020-9b7fa8c54323" containerName="registry-server"
Nov 24 09:45:00 crc kubenswrapper[4563]: I1124 09:45:00.135337 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="f93ca7ee-207f-475f-b020-9b7fa8c54323" containerName="registry-server"
Nov 24 09:45:00 crc kubenswrapper[4563]: I1124 09:45:00.135353 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="619ea327-b839-4e7a-8e23-f2a8ed6d4014" containerName="registry-server"
Nov 24 09:45:00 crc kubenswrapper[4563]: I1124 09:45:00.136001 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399625-v9hqh"
Nov 24 09:45:00 crc kubenswrapper[4563]: I1124 09:45:00.138045 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Nov 24 09:45:00 crc kubenswrapper[4563]: I1124 09:45:00.138169 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Nov 24 09:45:00 crc kubenswrapper[4563]: I1124 09:45:00.147905 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399625-v9hqh"]
Nov 24 09:45:00 crc kubenswrapper[4563]: I1124 09:45:00.189863 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc6fb\" (UniqueName: \"kubernetes.io/projected/ef7abbbb-a098-4e11-9133-8f7d8faed6a5-kube-api-access-wc6fb\") pod \"collect-profiles-29399625-v9hqh\" (UID: \"ef7abbbb-a098-4e11-9133-8f7d8faed6a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399625-v9hqh"
Nov 24 09:45:00 crc kubenswrapper[4563]: I1124 09:45:00.189947 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef7abbbb-a098-4e11-9133-8f7d8faed6a5-config-volume\") pod \"collect-profiles-29399625-v9hqh\" (UID: \"ef7abbbb-a098-4e11-9133-8f7d8faed6a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399625-v9hqh"
Nov 24 09:45:00 crc kubenswrapper[4563]: I1124 09:45:00.190224 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ef7abbbb-a098-4e11-9133-8f7d8faed6a5-secret-volume\") pod \"collect-profiles-29399625-v9hqh\" (UID: \"ef7abbbb-a098-4e11-9133-8f7d8faed6a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399625-v9hqh"
Nov 24 09:45:00 crc kubenswrapper[4563]: I1124 09:45:00.291315 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef7abbbb-a098-4e11-9133-8f7d8faed6a5-config-volume\") pod \"collect-profiles-29399625-v9hqh\" (UID: \"ef7abbbb-a098-4e11-9133-8f7d8faed6a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399625-v9hqh"
Nov 24 09:45:00 crc kubenswrapper[4563]: I1124 09:45:00.291454 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ef7abbbb-a098-4e11-9133-8f7d8faed6a5-secret-volume\") pod \"collect-profiles-29399625-v9hqh\" (UID: \"ef7abbbb-a098-4e11-9133-8f7d8faed6a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399625-v9hqh"
Nov 24 09:45:00 crc kubenswrapper[4563]: I1124 09:45:00.291506 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc6fb\" (UniqueName: \"kubernetes.io/projected/ef7abbbb-a098-4e11-9133-8f7d8faed6a5-kube-api-access-wc6fb\") pod \"collect-profiles-29399625-v9hqh\" (UID: \"ef7abbbb-a098-4e11-9133-8f7d8faed6a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399625-v9hqh"
Nov 24 09:45:00 crc kubenswrapper[4563]: I1124 09:45:00.292372 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef7abbbb-a098-4e11-9133-8f7d8faed6a5-config-volume\") pod \"collect-profiles-29399625-v9hqh\" (UID: \"ef7abbbb-a098-4e11-9133-8f7d8faed6a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399625-v9hqh"
Nov 24 09:45:00 crc kubenswrapper[4563]: I1124 09:45:00.297390 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName:
\"kubernetes.io/secret/ef7abbbb-a098-4e11-9133-8f7d8faed6a5-secret-volume\") pod \"collect-profiles-29399625-v9hqh\" (UID: \"ef7abbbb-a098-4e11-9133-8f7d8faed6a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399625-v9hqh" Nov 24 09:45:00 crc kubenswrapper[4563]: I1124 09:45:00.306825 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc6fb\" (UniqueName: \"kubernetes.io/projected/ef7abbbb-a098-4e11-9133-8f7d8faed6a5-kube-api-access-wc6fb\") pod \"collect-profiles-29399625-v9hqh\" (UID: \"ef7abbbb-a098-4e11-9133-8f7d8faed6a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399625-v9hqh" Nov 24 09:45:00 crc kubenswrapper[4563]: I1124 09:45:00.455544 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399625-v9hqh" Nov 24 09:45:00 crc kubenswrapper[4563]: I1124 09:45:00.842298 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399625-v9hqh"] Nov 24 09:45:01 crc kubenswrapper[4563]: I1124 09:45:01.653166 4563 generic.go:334] "Generic (PLEG): container finished" podID="ef7abbbb-a098-4e11-9133-8f7d8faed6a5" containerID="dcf741e09b185e53d429d5eaff5c5c58b418f46093070977660f724f4f15bac2" exitCode=0 Nov 24 09:45:01 crc kubenswrapper[4563]: I1124 09:45:01.653389 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399625-v9hqh" event={"ID":"ef7abbbb-a098-4e11-9133-8f7d8faed6a5","Type":"ContainerDied","Data":"dcf741e09b185e53d429d5eaff5c5c58b418f46093070977660f724f4f15bac2"} Nov 24 09:45:01 crc kubenswrapper[4563]: I1124 09:45:01.653437 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399625-v9hqh" 
event={"ID":"ef7abbbb-a098-4e11-9133-8f7d8faed6a5","Type":"ContainerStarted","Data":"1fe9d5b09498d82be009ad4ee1efeaff8c1fa8a796d6e094584a0aad5305c204"} Nov 24 09:45:02 crc kubenswrapper[4563]: I1124 09:45:02.054268 4563 scope.go:117] "RemoveContainer" containerID="4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610" Nov 24 09:45:02 crc kubenswrapper[4563]: E1124 09:45:02.054585 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:45:02 crc kubenswrapper[4563]: I1124 09:45:02.945510 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399625-v9hqh" Nov 24 09:45:03 crc kubenswrapper[4563]: I1124 09:45:03.141808 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ef7abbbb-a098-4e11-9133-8f7d8faed6a5-secret-volume\") pod \"ef7abbbb-a098-4e11-9133-8f7d8faed6a5\" (UID: \"ef7abbbb-a098-4e11-9133-8f7d8faed6a5\") " Nov 24 09:45:03 crc kubenswrapper[4563]: I1124 09:45:03.142115 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef7abbbb-a098-4e11-9133-8f7d8faed6a5-config-volume\") pod \"ef7abbbb-a098-4e11-9133-8f7d8faed6a5\" (UID: \"ef7abbbb-a098-4e11-9133-8f7d8faed6a5\") " Nov 24 09:45:03 crc kubenswrapper[4563]: I1124 09:45:03.142187 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wc6fb\" (UniqueName: 
\"kubernetes.io/projected/ef7abbbb-a098-4e11-9133-8f7d8faed6a5-kube-api-access-wc6fb\") pod \"ef7abbbb-a098-4e11-9133-8f7d8faed6a5\" (UID: \"ef7abbbb-a098-4e11-9133-8f7d8faed6a5\") " Nov 24 09:45:03 crc kubenswrapper[4563]: I1124 09:45:03.142881 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef7abbbb-a098-4e11-9133-8f7d8faed6a5-config-volume" (OuterVolumeSpecName: "config-volume") pod "ef7abbbb-a098-4e11-9133-8f7d8faed6a5" (UID: "ef7abbbb-a098-4e11-9133-8f7d8faed6a5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:45:03 crc kubenswrapper[4563]: I1124 09:45:03.148493 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef7abbbb-a098-4e11-9133-8f7d8faed6a5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ef7abbbb-a098-4e11-9133-8f7d8faed6a5" (UID: "ef7abbbb-a098-4e11-9133-8f7d8faed6a5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:45:03 crc kubenswrapper[4563]: I1124 09:45:03.149267 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef7abbbb-a098-4e11-9133-8f7d8faed6a5-kube-api-access-wc6fb" (OuterVolumeSpecName: "kube-api-access-wc6fb") pod "ef7abbbb-a098-4e11-9133-8f7d8faed6a5" (UID: "ef7abbbb-a098-4e11-9133-8f7d8faed6a5"). InnerVolumeSpecName "kube-api-access-wc6fb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:45:03 crc kubenswrapper[4563]: I1124 09:45:03.246243 4563 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ef7abbbb-a098-4e11-9133-8f7d8faed6a5-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 24 09:45:03 crc kubenswrapper[4563]: I1124 09:45:03.246273 4563 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef7abbbb-a098-4e11-9133-8f7d8faed6a5-config-volume\") on node \"crc\" DevicePath \"\"" Nov 24 09:45:03 crc kubenswrapper[4563]: I1124 09:45:03.246285 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wc6fb\" (UniqueName: \"kubernetes.io/projected/ef7abbbb-a098-4e11-9133-8f7d8faed6a5-kube-api-access-wc6fb\") on node \"crc\" DevicePath \"\"" Nov 24 09:45:03 crc kubenswrapper[4563]: I1124 09:45:03.668710 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399625-v9hqh" event={"ID":"ef7abbbb-a098-4e11-9133-8f7d8faed6a5","Type":"ContainerDied","Data":"1fe9d5b09498d82be009ad4ee1efeaff8c1fa8a796d6e094584a0aad5305c204"} Nov 24 09:45:03 crc kubenswrapper[4563]: I1124 09:45:03.668742 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399625-v9hqh" Nov 24 09:45:03 crc kubenswrapper[4563]: I1124 09:45:03.668749 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fe9d5b09498d82be009ad4ee1efeaff8c1fa8a796d6e094584a0aad5305c204" Nov 24 09:45:04 crc kubenswrapper[4563]: I1124 09:45:04.000370 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399580-sccd5"] Nov 24 09:45:04 crc kubenswrapper[4563]: I1124 09:45:04.008174 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399580-sccd5"] Nov 24 09:45:05 crc kubenswrapper[4563]: I1124 09:45:05.062806 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f15c383-9eb5-4942-a63d-48e54beea23d" path="/var/lib/kubelet/pods/6f15c383-9eb5-4942-a63d-48e54beea23d/volumes" Nov 24 09:45:17 crc kubenswrapper[4563]: I1124 09:45:17.054459 4563 scope.go:117] "RemoveContainer" containerID="4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610" Nov 24 09:45:17 crc kubenswrapper[4563]: E1124 09:45:17.055230 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:45:31 crc kubenswrapper[4563]: I1124 09:45:31.054343 4563 scope.go:117] "RemoveContainer" containerID="4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610" Nov 24 09:45:31 crc kubenswrapper[4563]: E1124 09:45:31.055214 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:45:42 crc kubenswrapper[4563]: I1124 09:45:42.054503 4563 scope.go:117] "RemoveContainer" containerID="4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610" Nov 24 09:45:42 crc kubenswrapper[4563]: E1124 09:45:42.055364 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:45:52 crc kubenswrapper[4563]: I1124 09:45:52.751683 4563 scope.go:117] "RemoveContainer" containerID="6877ba89bd141ca57d4b33eee569ea5163b25c9d082bd9bb4644097c9de2b7f2" Nov 24 09:45:55 crc kubenswrapper[4563]: I1124 09:45:55.055117 4563 scope.go:117] "RemoveContainer" containerID="4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610" Nov 24 09:45:55 crc kubenswrapper[4563]: E1124 09:45:55.055763 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:46:10 crc kubenswrapper[4563]: I1124 09:46:10.055122 4563 scope.go:117] "RemoveContainer" 
containerID="4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610" Nov 24 09:46:10 crc kubenswrapper[4563]: E1124 09:46:10.056256 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:46:25 crc kubenswrapper[4563]: I1124 09:46:25.055399 4563 scope.go:117] "RemoveContainer" containerID="4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610" Nov 24 09:46:25 crc kubenswrapper[4563]: E1124 09:46:25.056555 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:46:38 crc kubenswrapper[4563]: I1124 09:46:38.055168 4563 scope.go:117] "RemoveContainer" containerID="4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610" Nov 24 09:46:38 crc kubenswrapper[4563]: E1124 09:46:38.056067 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:46:52 crc kubenswrapper[4563]: I1124 09:46:52.054565 4563 scope.go:117] 
"RemoveContainer" containerID="4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610" Nov 24 09:46:52 crc kubenswrapper[4563]: E1124 09:46:52.055420 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:47:04 crc kubenswrapper[4563]: I1124 09:47:04.055201 4563 scope.go:117] "RemoveContainer" containerID="4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610" Nov 24 09:47:04 crc kubenswrapper[4563]: E1124 09:47:04.055830 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:47:16 crc kubenswrapper[4563]: I1124 09:47:16.054381 4563 scope.go:117] "RemoveContainer" containerID="4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610" Nov 24 09:47:16 crc kubenswrapper[4563]: E1124 09:47:16.054873 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:47:29 crc kubenswrapper[4563]: I1124 09:47:29.055112 
4563 scope.go:117] "RemoveContainer" containerID="4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610" Nov 24 09:47:29 crc kubenswrapper[4563]: E1124 09:47:29.056001 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:47:40 crc kubenswrapper[4563]: I1124 09:47:40.054521 4563 scope.go:117] "RemoveContainer" containerID="4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610" Nov 24 09:47:40 crc kubenswrapper[4563]: E1124 09:47:40.055389 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:47:54 crc kubenswrapper[4563]: I1124 09:47:54.054948 4563 scope.go:117] "RemoveContainer" containerID="4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610" Nov 24 09:47:54 crc kubenswrapper[4563]: E1124 09:47:54.055708 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:48:07 crc kubenswrapper[4563]: I1124 
09:48:07.054907 4563 scope.go:117] "RemoveContainer" containerID="4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610" Nov 24 09:48:07 crc kubenswrapper[4563]: E1124 09:48:07.055707 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:48:21 crc kubenswrapper[4563]: I1124 09:48:21.054382 4563 scope.go:117] "RemoveContainer" containerID="4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610" Nov 24 09:48:21 crc kubenswrapper[4563]: E1124 09:48:21.055192 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:48:32 crc kubenswrapper[4563]: I1124 09:48:32.054424 4563 scope.go:117] "RemoveContainer" containerID="4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610" Nov 24 09:48:32 crc kubenswrapper[4563]: E1124 09:48:32.056285 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:48:45 crc 
kubenswrapper[4563]: I1124 09:48:45.055743 4563 scope.go:117] "RemoveContainer" containerID="4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610" Nov 24 09:48:45 crc kubenswrapper[4563]: I1124 09:48:45.365459 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" event={"ID":"3b2bfe55-8989-49b3-bb61-e28189447627","Type":"ContainerStarted","Data":"61bede8c7960fe7299ad0cfd68d688b281ba1733220ddc88fefa904a4696a51c"} Nov 24 09:49:29 crc kubenswrapper[4563]: I1124 09:49:29.744518 4563 generic.go:334] "Generic (PLEG): container finished" podID="d15e06ff-83ac-44e9-aebe-9756628722e6" containerID="bff15d7286e8c850efd484a0717502e97e15313eca4f3b4fb2fd32320310b20d" exitCode=0 Nov 24 09:49:29 crc kubenswrapper[4563]: I1124 09:49:29.744750 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d15e06ff-83ac-44e9-aebe-9756628722e6","Type":"ContainerDied","Data":"bff15d7286e8c850efd484a0717502e97e15313eca4f3b4fb2fd32320310b20d"} Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.053046 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.148843 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d15e06ff-83ac-44e9-aebe-9756628722e6-openstack-config-secret\") pod \"d15e06ff-83ac-44e9-aebe-9756628722e6\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.148878 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d15e06ff-83ac-44e9-aebe-9756628722e6-ssh-key\") pod \"d15e06ff-83ac-44e9-aebe-9756628722e6\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.148902 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d15e06ff-83ac-44e9-aebe-9756628722e6-test-operator-ephemeral-workdir\") pod \"d15e06ff-83ac-44e9-aebe-9756628722e6\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.148948 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px7fv\" (UniqueName: \"kubernetes.io/projected/d15e06ff-83ac-44e9-aebe-9756628722e6-kube-api-access-px7fv\") pod \"d15e06ff-83ac-44e9-aebe-9756628722e6\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.148973 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d15e06ff-83ac-44e9-aebe-9756628722e6-ca-certs\") pod \"d15e06ff-83ac-44e9-aebe-9756628722e6\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.148989 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"d15e06ff-83ac-44e9-aebe-9756628722e6\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.149012 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d15e06ff-83ac-44e9-aebe-9756628722e6-config-data\") pod \"d15e06ff-83ac-44e9-aebe-9756628722e6\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.149033 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d15e06ff-83ac-44e9-aebe-9756628722e6-test-operator-ephemeral-temporary\") pod \"d15e06ff-83ac-44e9-aebe-9756628722e6\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.149059 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d15e06ff-83ac-44e9-aebe-9756628722e6-openstack-config\") pod \"d15e06ff-83ac-44e9-aebe-9756628722e6\" (UID: \"d15e06ff-83ac-44e9-aebe-9756628722e6\") " Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.149565 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d15e06ff-83ac-44e9-aebe-9756628722e6-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "d15e06ff-83ac-44e9-aebe-9756628722e6" (UID: "d15e06ff-83ac-44e9-aebe-9756628722e6"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.149767 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d15e06ff-83ac-44e9-aebe-9756628722e6-config-data" (OuterVolumeSpecName: "config-data") pod "d15e06ff-83ac-44e9-aebe-9756628722e6" (UID: "d15e06ff-83ac-44e9-aebe-9756628722e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.152927 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d15e06ff-83ac-44e9-aebe-9756628722e6-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "d15e06ff-83ac-44e9-aebe-9756628722e6" (UID: "d15e06ff-83ac-44e9-aebe-9756628722e6"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.154331 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "test-operator-logs") pod "d15e06ff-83ac-44e9-aebe-9756628722e6" (UID: "d15e06ff-83ac-44e9-aebe-9756628722e6"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.154488 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d15e06ff-83ac-44e9-aebe-9756628722e6-kube-api-access-px7fv" (OuterVolumeSpecName: "kube-api-access-px7fv") pod "d15e06ff-83ac-44e9-aebe-9756628722e6" (UID: "d15e06ff-83ac-44e9-aebe-9756628722e6"). InnerVolumeSpecName "kube-api-access-px7fv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.173162 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d15e06ff-83ac-44e9-aebe-9756628722e6-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "d15e06ff-83ac-44e9-aebe-9756628722e6" (UID: "d15e06ff-83ac-44e9-aebe-9756628722e6"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.173182 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d15e06ff-83ac-44e9-aebe-9756628722e6-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "d15e06ff-83ac-44e9-aebe-9756628722e6" (UID: "d15e06ff-83ac-44e9-aebe-9756628722e6"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.174808 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d15e06ff-83ac-44e9-aebe-9756628722e6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d15e06ff-83ac-44e9-aebe-9756628722e6" (UID: "d15e06ff-83ac-44e9-aebe-9756628722e6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.187581 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d15e06ff-83ac-44e9-aebe-9756628722e6-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "d15e06ff-83ac-44e9-aebe-9756628722e6" (UID: "d15e06ff-83ac-44e9-aebe-9756628722e6"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.251050 4563 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d15e06ff-83ac-44e9-aebe-9756628722e6-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.251079 4563 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d15e06ff-83ac-44e9-aebe-9756628722e6-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.251091 4563 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d15e06ff-83ac-44e9-aebe-9756628722e6-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.251103 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px7fv\" (UniqueName: \"kubernetes.io/projected/d15e06ff-83ac-44e9-aebe-9756628722e6-kube-api-access-px7fv\") on node \"crc\" DevicePath \"\"" Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.251113 4563 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d15e06ff-83ac-44e9-aebe-9756628722e6-ca-certs\") on node \"crc\" DevicePath \"\"" Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.251141 4563 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.251152 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d15e06ff-83ac-44e9-aebe-9756628722e6-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 
09:49:31.251162 4563 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d15e06ff-83ac-44e9-aebe-9756628722e6-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.251173 4563 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d15e06ff-83ac-44e9-aebe-9756628722e6-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.267521 4563 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.354044 4563 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.764934 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d15e06ff-83ac-44e9-aebe-9756628722e6","Type":"ContainerDied","Data":"48d83a625135a00ab5e632c2f8ebb9059ff28833e1cbafce66ce33efd48207fa"} Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.765144 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48d83a625135a00ab5e632c2f8ebb9059ff28833e1cbafce66ce33efd48207fa" Nov 24 09:49:31 crc kubenswrapper[4563]: I1124 09:49:31.764984 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 24 09:49:33 crc kubenswrapper[4563]: I1124 09:49:33.686040 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 24 09:49:33 crc kubenswrapper[4563]: E1124 09:49:33.687000 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d15e06ff-83ac-44e9-aebe-9756628722e6" containerName="tempest-tests-tempest-tests-runner" Nov 24 09:49:33 crc kubenswrapper[4563]: I1124 09:49:33.687016 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="d15e06ff-83ac-44e9-aebe-9756628722e6" containerName="tempest-tests-tempest-tests-runner" Nov 24 09:49:33 crc kubenswrapper[4563]: E1124 09:49:33.687044 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef7abbbb-a098-4e11-9133-8f7d8faed6a5" containerName="collect-profiles" Nov 24 09:49:33 crc kubenswrapper[4563]: I1124 09:49:33.687051 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef7abbbb-a098-4e11-9133-8f7d8faed6a5" containerName="collect-profiles" Nov 24 09:49:33 crc kubenswrapper[4563]: I1124 09:49:33.687290 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef7abbbb-a098-4e11-9133-8f7d8faed6a5" containerName="collect-profiles" Nov 24 09:49:33 crc kubenswrapper[4563]: I1124 09:49:33.687319 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="d15e06ff-83ac-44e9-aebe-9756628722e6" containerName="tempest-tests-tempest-tests-runner" Nov 24 09:49:33 crc kubenswrapper[4563]: I1124 09:49:33.688250 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 24 09:49:33 crc kubenswrapper[4563]: I1124 09:49:33.691214 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-nx56k" Nov 24 09:49:33 crc kubenswrapper[4563]: I1124 09:49:33.694373 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 24 09:49:33 crc kubenswrapper[4563]: I1124 09:49:33.792196 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57sqq\" (UniqueName: \"kubernetes.io/projected/b8a9086a-b375-485e-990a-27b6e4832c77-kube-api-access-57sqq\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b8a9086a-b375-485e-990a-27b6e4832c77\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 24 09:49:33 crc kubenswrapper[4563]: I1124 09:49:33.792605 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b8a9086a-b375-485e-990a-27b6e4832c77\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 24 09:49:33 crc kubenswrapper[4563]: I1124 09:49:33.894578 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b8a9086a-b375-485e-990a-27b6e4832c77\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 24 09:49:33 crc kubenswrapper[4563]: I1124 09:49:33.894843 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57sqq\" (UniqueName: 
\"kubernetes.io/projected/b8a9086a-b375-485e-990a-27b6e4832c77-kube-api-access-57sqq\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b8a9086a-b375-485e-990a-27b6e4832c77\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 24 09:49:33 crc kubenswrapper[4563]: I1124 09:49:33.894905 4563 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b8a9086a-b375-485e-990a-27b6e4832c77\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 24 09:49:33 crc kubenswrapper[4563]: I1124 09:49:33.913473 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57sqq\" (UniqueName: \"kubernetes.io/projected/b8a9086a-b375-485e-990a-27b6e4832c77-kube-api-access-57sqq\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b8a9086a-b375-485e-990a-27b6e4832c77\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 24 09:49:33 crc kubenswrapper[4563]: I1124 09:49:33.917882 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b8a9086a-b375-485e-990a-27b6e4832c77\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 24 09:49:34 crc kubenswrapper[4563]: I1124 09:49:34.005949 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 24 09:49:34 crc kubenswrapper[4563]: I1124 09:49:34.373569 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 24 09:49:34 crc kubenswrapper[4563]: I1124 09:49:34.380869 4563 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 24 09:49:34 crc kubenswrapper[4563]: I1124 09:49:34.786328 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"b8a9086a-b375-485e-990a-27b6e4832c77","Type":"ContainerStarted","Data":"2d5ab60a979c6e2c98b8aa5f5f48fd9f9dd0b1aebdfdb85bd43d4d0f1c5bbff5"} Nov 24 09:49:35 crc kubenswrapper[4563]: I1124 09:49:35.795960 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"b8a9086a-b375-485e-990a-27b6e4832c77","Type":"ContainerStarted","Data":"2fbd83c34c68d08805f40552ce9015bcaf3be1a6f5828bd9d17d1cdf5de1ad9d"} Nov 24 09:49:35 crc kubenswrapper[4563]: I1124 09:49:35.808954 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.7199542669999999 podStartE2EDuration="2.80894292s" podCreationTimestamp="2025-11-24 09:49:33 +0000 UTC" firstStartedPulling="2025-11-24 09:49:34.380553366 +0000 UTC m=+2751.639530813" lastFinishedPulling="2025-11-24 09:49:35.469542019 +0000 UTC m=+2752.728519466" observedRunningTime="2025-11-24 09:49:35.807428223 +0000 UTC m=+2753.066405670" watchObservedRunningTime="2025-11-24 09:49:35.80894292 +0000 UTC m=+2753.067920367" Nov 24 09:49:53 crc kubenswrapper[4563]: I1124 09:49:53.887653 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vndt9/must-gather-gxrkl"] Nov 24 09:49:53 crc kubenswrapper[4563]: I1124 
09:49:53.889700 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vndt9/must-gather-gxrkl" Nov 24 09:49:53 crc kubenswrapper[4563]: I1124 09:49:53.894888 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vndt9"/"openshift-service-ca.crt" Nov 24 09:49:53 crc kubenswrapper[4563]: I1124 09:49:53.897753 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vndt9"/"kube-root-ca.crt" Nov 24 09:49:53 crc kubenswrapper[4563]: I1124 09:49:53.937402 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vndt9/must-gather-gxrkl"] Nov 24 09:49:53 crc kubenswrapper[4563]: I1124 09:49:53.956794 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ff74dd16-80eb-41bb-93bb-91e9f6f96748-must-gather-output\") pod \"must-gather-gxrkl\" (UID: \"ff74dd16-80eb-41bb-93bb-91e9f6f96748\") " pod="openshift-must-gather-vndt9/must-gather-gxrkl" Nov 24 09:49:53 crc kubenswrapper[4563]: I1124 09:49:53.957165 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8r7z\" (UniqueName: \"kubernetes.io/projected/ff74dd16-80eb-41bb-93bb-91e9f6f96748-kube-api-access-q8r7z\") pod \"must-gather-gxrkl\" (UID: \"ff74dd16-80eb-41bb-93bb-91e9f6f96748\") " pod="openshift-must-gather-vndt9/must-gather-gxrkl" Nov 24 09:49:54 crc kubenswrapper[4563]: I1124 09:49:54.059037 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ff74dd16-80eb-41bb-93bb-91e9f6f96748-must-gather-output\") pod \"must-gather-gxrkl\" (UID: \"ff74dd16-80eb-41bb-93bb-91e9f6f96748\") " pod="openshift-must-gather-vndt9/must-gather-gxrkl" Nov 24 09:49:54 crc kubenswrapper[4563]: I1124 09:49:54.059286 4563 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8r7z\" (UniqueName: \"kubernetes.io/projected/ff74dd16-80eb-41bb-93bb-91e9f6f96748-kube-api-access-q8r7z\") pod \"must-gather-gxrkl\" (UID: \"ff74dd16-80eb-41bb-93bb-91e9f6f96748\") " pod="openshift-must-gather-vndt9/must-gather-gxrkl" Nov 24 09:49:54 crc kubenswrapper[4563]: I1124 09:49:54.059470 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ff74dd16-80eb-41bb-93bb-91e9f6f96748-must-gather-output\") pod \"must-gather-gxrkl\" (UID: \"ff74dd16-80eb-41bb-93bb-91e9f6f96748\") " pod="openshift-must-gather-vndt9/must-gather-gxrkl" Nov 24 09:49:54 crc kubenswrapper[4563]: I1124 09:49:54.078238 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8r7z\" (UniqueName: \"kubernetes.io/projected/ff74dd16-80eb-41bb-93bb-91e9f6f96748-kube-api-access-q8r7z\") pod \"must-gather-gxrkl\" (UID: \"ff74dd16-80eb-41bb-93bb-91e9f6f96748\") " pod="openshift-must-gather-vndt9/must-gather-gxrkl" Nov 24 09:49:54 crc kubenswrapper[4563]: I1124 09:49:54.205239 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vndt9/must-gather-gxrkl" Nov 24 09:49:54 crc kubenswrapper[4563]: I1124 09:49:54.615686 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vndt9/must-gather-gxrkl"] Nov 24 09:49:54 crc kubenswrapper[4563]: W1124 09:49:54.618765 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff74dd16_80eb_41bb_93bb_91e9f6f96748.slice/crio-0fbbd79e3a57e285bbd39379e52e3d794547e4d5a6d18e936ce032968652291e WatchSource:0}: Error finding container 0fbbd79e3a57e285bbd39379e52e3d794547e4d5a6d18e936ce032968652291e: Status 404 returned error can't find the container with id 0fbbd79e3a57e285bbd39379e52e3d794547e4d5a6d18e936ce032968652291e Nov 24 09:49:54 crc kubenswrapper[4563]: I1124 09:49:54.940346 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vndt9/must-gather-gxrkl" event={"ID":"ff74dd16-80eb-41bb-93bb-91e9f6f96748","Type":"ContainerStarted","Data":"0fbbd79e3a57e285bbd39379e52e3d794547e4d5a6d18e936ce032968652291e"} Nov 24 09:50:00 crc kubenswrapper[4563]: I1124 09:50:00.995182 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vndt9/must-gather-gxrkl" event={"ID":"ff74dd16-80eb-41bb-93bb-91e9f6f96748","Type":"ContainerStarted","Data":"699c09ebab645394b3d5f865e79a18c61b7cecd095639fad1d525aeff5d4a0e5"} Nov 24 09:50:00 crc kubenswrapper[4563]: I1124 09:50:00.995756 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vndt9/must-gather-gxrkl" event={"ID":"ff74dd16-80eb-41bb-93bb-91e9f6f96748","Type":"ContainerStarted","Data":"fe5b7a77e60574de53067cedb6181b0255c0f8f4da729ebff79bd2ac207990d8"} Nov 24 09:50:01 crc kubenswrapper[4563]: I1124 09:50:01.011249 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vndt9/must-gather-gxrkl" podStartSLOduration=2.298852391 
podStartE2EDuration="8.011231611s" podCreationTimestamp="2025-11-24 09:49:53 +0000 UTC" firstStartedPulling="2025-11-24 09:49:54.621079227 +0000 UTC m=+2771.880056673" lastFinishedPulling="2025-11-24 09:50:00.333458446 +0000 UTC m=+2777.592435893" observedRunningTime="2025-11-24 09:50:01.006995364 +0000 UTC m=+2778.265972811" watchObservedRunningTime="2025-11-24 09:50:01.011231611 +0000 UTC m=+2778.270209058" Nov 24 09:50:03 crc kubenswrapper[4563]: I1124 09:50:03.398802 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vndt9/crc-debug-mbbf5"] Nov 24 09:50:03 crc kubenswrapper[4563]: I1124 09:50:03.401050 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vndt9/crc-debug-mbbf5" Nov 24 09:50:03 crc kubenswrapper[4563]: I1124 09:50:03.402885 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-vndt9"/"default-dockercfg-95dhj" Nov 24 09:50:03 crc kubenswrapper[4563]: I1124 09:50:03.476362 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vvd6\" (UniqueName: \"kubernetes.io/projected/df19eebe-6c70-4704-a8c1-36289ba9a440-kube-api-access-2vvd6\") pod \"crc-debug-mbbf5\" (UID: \"df19eebe-6c70-4704-a8c1-36289ba9a440\") " pod="openshift-must-gather-vndt9/crc-debug-mbbf5" Nov 24 09:50:03 crc kubenswrapper[4563]: I1124 09:50:03.476697 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df19eebe-6c70-4704-a8c1-36289ba9a440-host\") pod \"crc-debug-mbbf5\" (UID: \"df19eebe-6c70-4704-a8c1-36289ba9a440\") " pod="openshift-must-gather-vndt9/crc-debug-mbbf5" Nov 24 09:50:03 crc kubenswrapper[4563]: I1124 09:50:03.578804 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vvd6\" (UniqueName: 
\"kubernetes.io/projected/df19eebe-6c70-4704-a8c1-36289ba9a440-kube-api-access-2vvd6\") pod \"crc-debug-mbbf5\" (UID: \"df19eebe-6c70-4704-a8c1-36289ba9a440\") " pod="openshift-must-gather-vndt9/crc-debug-mbbf5" Nov 24 09:50:03 crc kubenswrapper[4563]: I1124 09:50:03.578880 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df19eebe-6c70-4704-a8c1-36289ba9a440-host\") pod \"crc-debug-mbbf5\" (UID: \"df19eebe-6c70-4704-a8c1-36289ba9a440\") " pod="openshift-must-gather-vndt9/crc-debug-mbbf5" Nov 24 09:50:03 crc kubenswrapper[4563]: I1124 09:50:03.579033 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df19eebe-6c70-4704-a8c1-36289ba9a440-host\") pod \"crc-debug-mbbf5\" (UID: \"df19eebe-6c70-4704-a8c1-36289ba9a440\") " pod="openshift-must-gather-vndt9/crc-debug-mbbf5" Nov 24 09:50:03 crc kubenswrapper[4563]: I1124 09:50:03.604278 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vvd6\" (UniqueName: \"kubernetes.io/projected/df19eebe-6c70-4704-a8c1-36289ba9a440-kube-api-access-2vvd6\") pod \"crc-debug-mbbf5\" (UID: \"df19eebe-6c70-4704-a8c1-36289ba9a440\") " pod="openshift-must-gather-vndt9/crc-debug-mbbf5" Nov 24 09:50:03 crc kubenswrapper[4563]: I1124 09:50:03.718897 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vndt9/crc-debug-mbbf5" Nov 24 09:50:03 crc kubenswrapper[4563]: W1124 09:50:03.747106 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf19eebe_6c70_4704_a8c1_36289ba9a440.slice/crio-67042a602c581d6d41728c7738b54355cb20e769874d9793ec5d6e3667683117 WatchSource:0}: Error finding container 67042a602c581d6d41728c7738b54355cb20e769874d9793ec5d6e3667683117: Status 404 returned error can't find the container with id 67042a602c581d6d41728c7738b54355cb20e769874d9793ec5d6e3667683117 Nov 24 09:50:04 crc kubenswrapper[4563]: I1124 09:50:04.023574 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vndt9/crc-debug-mbbf5" event={"ID":"df19eebe-6c70-4704-a8c1-36289ba9a440","Type":"ContainerStarted","Data":"67042a602c581d6d41728c7738b54355cb20e769874d9793ec5d6e3667683117"} Nov 24 09:50:13 crc kubenswrapper[4563]: I1124 09:50:13.116899 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vndt9/crc-debug-mbbf5" event={"ID":"df19eebe-6c70-4704-a8c1-36289ba9a440","Type":"ContainerStarted","Data":"84ea453f1116c3c4adee82d73c15de223d8095f1f85f360d9a13705822d1678a"} Nov 24 09:50:13 crc kubenswrapper[4563]: I1124 09:50:13.134387 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vndt9/crc-debug-mbbf5" podStartSLOduration=1.209523313 podStartE2EDuration="10.134372396s" podCreationTimestamp="2025-11-24 09:50:03 +0000 UTC" firstStartedPulling="2025-11-24 09:50:03.749343442 +0000 UTC m=+2781.008320889" lastFinishedPulling="2025-11-24 09:50:12.674192525 +0000 UTC m=+2789.933169972" observedRunningTime="2025-11-24 09:50:13.129724463 +0000 UTC m=+2790.388701910" watchObservedRunningTime="2025-11-24 09:50:13.134372396 +0000 UTC m=+2790.393349843" Nov 24 09:50:43 crc kubenswrapper[4563]: I1124 09:50:43.316494 4563 generic.go:334] "Generic (PLEG): container 
finished" podID="df19eebe-6c70-4704-a8c1-36289ba9a440" containerID="84ea453f1116c3c4adee82d73c15de223d8095f1f85f360d9a13705822d1678a" exitCode=0 Nov 24 09:50:43 crc kubenswrapper[4563]: I1124 09:50:43.316589 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vndt9/crc-debug-mbbf5" event={"ID":"df19eebe-6c70-4704-a8c1-36289ba9a440","Type":"ContainerDied","Data":"84ea453f1116c3c4adee82d73c15de223d8095f1f85f360d9a13705822d1678a"} Nov 24 09:50:44 crc kubenswrapper[4563]: I1124 09:50:44.395401 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vndt9/crc-debug-mbbf5" Nov 24 09:50:44 crc kubenswrapper[4563]: I1124 09:50:44.416588 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vndt9/crc-debug-mbbf5"] Nov 24 09:50:44 crc kubenswrapper[4563]: I1124 09:50:44.421509 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vndt9/crc-debug-mbbf5"] Nov 24 09:50:44 crc kubenswrapper[4563]: I1124 09:50:44.520200 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df19eebe-6c70-4704-a8c1-36289ba9a440-host\") pod \"df19eebe-6c70-4704-a8c1-36289ba9a440\" (UID: \"df19eebe-6c70-4704-a8c1-36289ba9a440\") " Nov 24 09:50:44 crc kubenswrapper[4563]: I1124 09:50:44.520320 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vvd6\" (UniqueName: \"kubernetes.io/projected/df19eebe-6c70-4704-a8c1-36289ba9a440-kube-api-access-2vvd6\") pod \"df19eebe-6c70-4704-a8c1-36289ba9a440\" (UID: \"df19eebe-6c70-4704-a8c1-36289ba9a440\") " Nov 24 09:50:44 crc kubenswrapper[4563]: I1124 09:50:44.520315 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df19eebe-6c70-4704-a8c1-36289ba9a440-host" (OuterVolumeSpecName: "host") pod "df19eebe-6c70-4704-a8c1-36289ba9a440" (UID: 
"df19eebe-6c70-4704-a8c1-36289ba9a440"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:50:44 crc kubenswrapper[4563]: I1124 09:50:44.520840 4563 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df19eebe-6c70-4704-a8c1-36289ba9a440-host\") on node \"crc\" DevicePath \"\"" Nov 24 09:50:44 crc kubenswrapper[4563]: I1124 09:50:44.524307 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df19eebe-6c70-4704-a8c1-36289ba9a440-kube-api-access-2vvd6" (OuterVolumeSpecName: "kube-api-access-2vvd6") pod "df19eebe-6c70-4704-a8c1-36289ba9a440" (UID: "df19eebe-6c70-4704-a8c1-36289ba9a440"). InnerVolumeSpecName "kube-api-access-2vvd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:50:44 crc kubenswrapper[4563]: I1124 09:50:44.622186 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vvd6\" (UniqueName: \"kubernetes.io/projected/df19eebe-6c70-4704-a8c1-36289ba9a440-kube-api-access-2vvd6\") on node \"crc\" DevicePath \"\"" Nov 24 09:50:45 crc kubenswrapper[4563]: I1124 09:50:45.063293 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df19eebe-6c70-4704-a8c1-36289ba9a440" path="/var/lib/kubelet/pods/df19eebe-6c70-4704-a8c1-36289ba9a440/volumes" Nov 24 09:50:45 crc kubenswrapper[4563]: I1124 09:50:45.329624 4563 scope.go:117] "RemoveContainer" containerID="84ea453f1116c3c4adee82d73c15de223d8095f1f85f360d9a13705822d1678a" Nov 24 09:50:45 crc kubenswrapper[4563]: I1124 09:50:45.329669 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vndt9/crc-debug-mbbf5" Nov 24 09:50:45 crc kubenswrapper[4563]: I1124 09:50:45.528021 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vndt9/crc-debug-lps9h"] Nov 24 09:50:45 crc kubenswrapper[4563]: E1124 09:50:45.528424 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df19eebe-6c70-4704-a8c1-36289ba9a440" containerName="container-00" Nov 24 09:50:45 crc kubenswrapper[4563]: I1124 09:50:45.528437 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="df19eebe-6c70-4704-a8c1-36289ba9a440" containerName="container-00" Nov 24 09:50:45 crc kubenswrapper[4563]: I1124 09:50:45.528626 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="df19eebe-6c70-4704-a8c1-36289ba9a440" containerName="container-00" Nov 24 09:50:45 crc kubenswrapper[4563]: I1124 09:50:45.529191 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vndt9/crc-debug-lps9h" Nov 24 09:50:45 crc kubenswrapper[4563]: I1124 09:50:45.530789 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-vndt9"/"default-dockercfg-95dhj" Nov 24 09:50:45 crc kubenswrapper[4563]: I1124 09:50:45.637218 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvnnl\" (UniqueName: \"kubernetes.io/projected/57b4fbce-39fa-4782-a011-d594c9c472a6-kube-api-access-tvnnl\") pod \"crc-debug-lps9h\" (UID: \"57b4fbce-39fa-4782-a011-d594c9c472a6\") " pod="openshift-must-gather-vndt9/crc-debug-lps9h" Nov 24 09:50:45 crc kubenswrapper[4563]: I1124 09:50:45.637417 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57b4fbce-39fa-4782-a011-d594c9c472a6-host\") pod \"crc-debug-lps9h\" (UID: \"57b4fbce-39fa-4782-a011-d594c9c472a6\") " 
pod="openshift-must-gather-vndt9/crc-debug-lps9h" Nov 24 09:50:45 crc kubenswrapper[4563]: I1124 09:50:45.738893 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57b4fbce-39fa-4782-a011-d594c9c472a6-host\") pod \"crc-debug-lps9h\" (UID: \"57b4fbce-39fa-4782-a011-d594c9c472a6\") " pod="openshift-must-gather-vndt9/crc-debug-lps9h" Nov 24 09:50:45 crc kubenswrapper[4563]: I1124 09:50:45.738978 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvnnl\" (UniqueName: \"kubernetes.io/projected/57b4fbce-39fa-4782-a011-d594c9c472a6-kube-api-access-tvnnl\") pod \"crc-debug-lps9h\" (UID: \"57b4fbce-39fa-4782-a011-d594c9c472a6\") " pod="openshift-must-gather-vndt9/crc-debug-lps9h" Nov 24 09:50:45 crc kubenswrapper[4563]: I1124 09:50:45.739029 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57b4fbce-39fa-4782-a011-d594c9c472a6-host\") pod \"crc-debug-lps9h\" (UID: \"57b4fbce-39fa-4782-a011-d594c9c472a6\") " pod="openshift-must-gather-vndt9/crc-debug-lps9h" Nov 24 09:50:45 crc kubenswrapper[4563]: I1124 09:50:45.753417 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvnnl\" (UniqueName: \"kubernetes.io/projected/57b4fbce-39fa-4782-a011-d594c9c472a6-kube-api-access-tvnnl\") pod \"crc-debug-lps9h\" (UID: \"57b4fbce-39fa-4782-a011-d594c9c472a6\") " pod="openshift-must-gather-vndt9/crc-debug-lps9h" Nov 24 09:50:45 crc kubenswrapper[4563]: I1124 09:50:45.841281 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vndt9/crc-debug-lps9h" Nov 24 09:50:45 crc kubenswrapper[4563]: W1124 09:50:45.860575 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57b4fbce_39fa_4782_a011_d594c9c472a6.slice/crio-d994eedad79ee91c507aee65f6c75c347ab6a203b118120a1d0a66aacc917fb0 WatchSource:0}: Error finding container d994eedad79ee91c507aee65f6c75c347ab6a203b118120a1d0a66aacc917fb0: Status 404 returned error can't find the container with id d994eedad79ee91c507aee65f6c75c347ab6a203b118120a1d0a66aacc917fb0 Nov 24 09:50:46 crc kubenswrapper[4563]: I1124 09:50:46.337664 4563 generic.go:334] "Generic (PLEG): container finished" podID="57b4fbce-39fa-4782-a011-d594c9c472a6" containerID="5b165adb085809131dbe2a6b92ddccbc5d064c1ae2edeb0ac81e0923214c708a" exitCode=0 Nov 24 09:50:46 crc kubenswrapper[4563]: I1124 09:50:46.337738 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vndt9/crc-debug-lps9h" event={"ID":"57b4fbce-39fa-4782-a011-d594c9c472a6","Type":"ContainerDied","Data":"5b165adb085809131dbe2a6b92ddccbc5d064c1ae2edeb0ac81e0923214c708a"} Nov 24 09:50:46 crc kubenswrapper[4563]: I1124 09:50:46.337939 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vndt9/crc-debug-lps9h" event={"ID":"57b4fbce-39fa-4782-a011-d594c9c472a6","Type":"ContainerStarted","Data":"d994eedad79ee91c507aee65f6c75c347ab6a203b118120a1d0a66aacc917fb0"} Nov 24 09:50:46 crc kubenswrapper[4563]: I1124 09:50:46.732213 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vndt9/crc-debug-lps9h"] Nov 24 09:50:46 crc kubenswrapper[4563]: I1124 09:50:46.739247 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vndt9/crc-debug-lps9h"] Nov 24 09:50:47 crc kubenswrapper[4563]: I1124 09:50:47.415107 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vndt9/crc-debug-lps9h" Nov 24 09:50:47 crc kubenswrapper[4563]: I1124 09:50:47.565331 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57b4fbce-39fa-4782-a011-d594c9c472a6-host\") pod \"57b4fbce-39fa-4782-a011-d594c9c472a6\" (UID: \"57b4fbce-39fa-4782-a011-d594c9c472a6\") " Nov 24 09:50:47 crc kubenswrapper[4563]: I1124 09:50:47.565414 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvnnl\" (UniqueName: \"kubernetes.io/projected/57b4fbce-39fa-4782-a011-d594c9c472a6-kube-api-access-tvnnl\") pod \"57b4fbce-39fa-4782-a011-d594c9c472a6\" (UID: \"57b4fbce-39fa-4782-a011-d594c9c472a6\") " Nov 24 09:50:47 crc kubenswrapper[4563]: I1124 09:50:47.565441 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57b4fbce-39fa-4782-a011-d594c9c472a6-host" (OuterVolumeSpecName: "host") pod "57b4fbce-39fa-4782-a011-d594c9c472a6" (UID: "57b4fbce-39fa-4782-a011-d594c9c472a6"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:50:47 crc kubenswrapper[4563]: I1124 09:50:47.565703 4563 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57b4fbce-39fa-4782-a011-d594c9c472a6-host\") on node \"crc\" DevicePath \"\"" Nov 24 09:50:47 crc kubenswrapper[4563]: I1124 09:50:47.569370 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57b4fbce-39fa-4782-a011-d594c9c472a6-kube-api-access-tvnnl" (OuterVolumeSpecName: "kube-api-access-tvnnl") pod "57b4fbce-39fa-4782-a011-d594c9c472a6" (UID: "57b4fbce-39fa-4782-a011-d594c9c472a6"). InnerVolumeSpecName "kube-api-access-tvnnl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:50:47 crc kubenswrapper[4563]: I1124 09:50:47.667185 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvnnl\" (UniqueName: \"kubernetes.io/projected/57b4fbce-39fa-4782-a011-d594c9c472a6-kube-api-access-tvnnl\") on node \"crc\" DevicePath \"\"" Nov 24 09:50:47 crc kubenswrapper[4563]: I1124 09:50:47.841049 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vndt9/crc-debug-zkd24"] Nov 24 09:50:47 crc kubenswrapper[4563]: E1124 09:50:47.841525 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57b4fbce-39fa-4782-a011-d594c9c472a6" containerName="container-00" Nov 24 09:50:47 crc kubenswrapper[4563]: I1124 09:50:47.841536 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="57b4fbce-39fa-4782-a011-d594c9c472a6" containerName="container-00" Nov 24 09:50:47 crc kubenswrapper[4563]: I1124 09:50:47.841741 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="57b4fbce-39fa-4782-a011-d594c9c472a6" containerName="container-00" Nov 24 09:50:47 crc kubenswrapper[4563]: I1124 09:50:47.842282 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vndt9/crc-debug-zkd24" Nov 24 09:50:47 crc kubenswrapper[4563]: I1124 09:50:47.971607 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4nn5\" (UniqueName: \"kubernetes.io/projected/f6282c23-c8b5-42a6-86d0-69f59a4a1fbe-kube-api-access-v4nn5\") pod \"crc-debug-zkd24\" (UID: \"f6282c23-c8b5-42a6-86d0-69f59a4a1fbe\") " pod="openshift-must-gather-vndt9/crc-debug-zkd24" Nov 24 09:50:47 crc kubenswrapper[4563]: I1124 09:50:47.971730 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f6282c23-c8b5-42a6-86d0-69f59a4a1fbe-host\") pod \"crc-debug-zkd24\" (UID: \"f6282c23-c8b5-42a6-86d0-69f59a4a1fbe\") " pod="openshift-must-gather-vndt9/crc-debug-zkd24" Nov 24 09:50:48 crc kubenswrapper[4563]: I1124 09:50:48.073141 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4nn5\" (UniqueName: \"kubernetes.io/projected/f6282c23-c8b5-42a6-86d0-69f59a4a1fbe-kube-api-access-v4nn5\") pod \"crc-debug-zkd24\" (UID: \"f6282c23-c8b5-42a6-86d0-69f59a4a1fbe\") " pod="openshift-must-gather-vndt9/crc-debug-zkd24" Nov 24 09:50:48 crc kubenswrapper[4563]: I1124 09:50:48.073330 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f6282c23-c8b5-42a6-86d0-69f59a4a1fbe-host\") pod \"crc-debug-zkd24\" (UID: \"f6282c23-c8b5-42a6-86d0-69f59a4a1fbe\") " pod="openshift-must-gather-vndt9/crc-debug-zkd24" Nov 24 09:50:48 crc kubenswrapper[4563]: I1124 09:50:48.073527 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f6282c23-c8b5-42a6-86d0-69f59a4a1fbe-host\") pod \"crc-debug-zkd24\" (UID: \"f6282c23-c8b5-42a6-86d0-69f59a4a1fbe\") " pod="openshift-must-gather-vndt9/crc-debug-zkd24" Nov 24 09:50:48 crc 
kubenswrapper[4563]: I1124 09:50:48.086528 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4nn5\" (UniqueName: \"kubernetes.io/projected/f6282c23-c8b5-42a6-86d0-69f59a4a1fbe-kube-api-access-v4nn5\") pod \"crc-debug-zkd24\" (UID: \"f6282c23-c8b5-42a6-86d0-69f59a4a1fbe\") " pod="openshift-must-gather-vndt9/crc-debug-zkd24" Nov 24 09:50:48 crc kubenswrapper[4563]: I1124 09:50:48.154940 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vndt9/crc-debug-zkd24" Nov 24 09:50:48 crc kubenswrapper[4563]: W1124 09:50:48.174726 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6282c23_c8b5_42a6_86d0_69f59a4a1fbe.slice/crio-8fb3f23ff45ba484f94cebe292d2cf5d324c0ca5ea0b4f95c5dbf7098b7d9d28 WatchSource:0}: Error finding container 8fb3f23ff45ba484f94cebe292d2cf5d324c0ca5ea0b4f95c5dbf7098b7d9d28: Status 404 returned error can't find the container with id 8fb3f23ff45ba484f94cebe292d2cf5d324c0ca5ea0b4f95c5dbf7098b7d9d28 Nov 24 09:50:48 crc kubenswrapper[4563]: I1124 09:50:48.353060 4563 generic.go:334] "Generic (PLEG): container finished" podID="f6282c23-c8b5-42a6-86d0-69f59a4a1fbe" containerID="1cf5b34d3e70b206b8a5fe3f3695f23729f2dcd02d9fdf679d4d64ca55ff2ce4" exitCode=0 Nov 24 09:50:48 crc kubenswrapper[4563]: I1124 09:50:48.353185 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vndt9/crc-debug-zkd24" event={"ID":"f6282c23-c8b5-42a6-86d0-69f59a4a1fbe","Type":"ContainerDied","Data":"1cf5b34d3e70b206b8a5fe3f3695f23729f2dcd02d9fdf679d4d64ca55ff2ce4"} Nov 24 09:50:48 crc kubenswrapper[4563]: I1124 09:50:48.353300 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vndt9/crc-debug-zkd24" event={"ID":"f6282c23-c8b5-42a6-86d0-69f59a4a1fbe","Type":"ContainerStarted","Data":"8fb3f23ff45ba484f94cebe292d2cf5d324c0ca5ea0b4f95c5dbf7098b7d9d28"} Nov 24 
09:50:48 crc kubenswrapper[4563]: I1124 09:50:48.354975 4563 scope.go:117] "RemoveContainer" containerID="5b165adb085809131dbe2a6b92ddccbc5d064c1ae2edeb0ac81e0923214c708a" Nov 24 09:50:48 crc kubenswrapper[4563]: I1124 09:50:48.355073 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vndt9/crc-debug-lps9h" Nov 24 09:50:48 crc kubenswrapper[4563]: I1124 09:50:48.378521 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vndt9/crc-debug-zkd24"] Nov 24 09:50:48 crc kubenswrapper[4563]: I1124 09:50:48.385029 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vndt9/crc-debug-zkd24"] Nov 24 09:50:49 crc kubenswrapper[4563]: I1124 09:50:49.062911 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57b4fbce-39fa-4782-a011-d594c9c472a6" path="/var/lib/kubelet/pods/57b4fbce-39fa-4782-a011-d594c9c472a6/volumes" Nov 24 09:50:49 crc kubenswrapper[4563]: I1124 09:50:49.431957 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vndt9/crc-debug-zkd24" Nov 24 09:50:49 crc kubenswrapper[4563]: I1124 09:50:49.495252 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f6282c23-c8b5-42a6-86d0-69f59a4a1fbe-host\") pod \"f6282c23-c8b5-42a6-86d0-69f59a4a1fbe\" (UID: \"f6282c23-c8b5-42a6-86d0-69f59a4a1fbe\") " Nov 24 09:50:49 crc kubenswrapper[4563]: I1124 09:50:49.495358 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6282c23-c8b5-42a6-86d0-69f59a4a1fbe-host" (OuterVolumeSpecName: "host") pod "f6282c23-c8b5-42a6-86d0-69f59a4a1fbe" (UID: "f6282c23-c8b5-42a6-86d0-69f59a4a1fbe"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:50:49 crc kubenswrapper[4563]: I1124 09:50:49.495446 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4nn5\" (UniqueName: \"kubernetes.io/projected/f6282c23-c8b5-42a6-86d0-69f59a4a1fbe-kube-api-access-v4nn5\") pod \"f6282c23-c8b5-42a6-86d0-69f59a4a1fbe\" (UID: \"f6282c23-c8b5-42a6-86d0-69f59a4a1fbe\") " Nov 24 09:50:49 crc kubenswrapper[4563]: I1124 09:50:49.495887 4563 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f6282c23-c8b5-42a6-86d0-69f59a4a1fbe-host\") on node \"crc\" DevicePath \"\"" Nov 24 09:50:49 crc kubenswrapper[4563]: I1124 09:50:49.499995 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6282c23-c8b5-42a6-86d0-69f59a4a1fbe-kube-api-access-v4nn5" (OuterVolumeSpecName: "kube-api-access-v4nn5") pod "f6282c23-c8b5-42a6-86d0-69f59a4a1fbe" (UID: "f6282c23-c8b5-42a6-86d0-69f59a4a1fbe"). InnerVolumeSpecName "kube-api-access-v4nn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:50:49 crc kubenswrapper[4563]: I1124 09:50:49.596969 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4nn5\" (UniqueName: \"kubernetes.io/projected/f6282c23-c8b5-42a6-86d0-69f59a4a1fbe-kube-api-access-v4nn5\") on node \"crc\" DevicePath \"\"" Nov 24 09:50:50 crc kubenswrapper[4563]: I1124 09:50:50.371851 4563 scope.go:117] "RemoveContainer" containerID="1cf5b34d3e70b206b8a5fe3f3695f23729f2dcd02d9fdf679d4d64ca55ff2ce4" Nov 24 09:50:50 crc kubenswrapper[4563]: I1124 09:50:50.371878 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vndt9/crc-debug-zkd24" Nov 24 09:50:51 crc kubenswrapper[4563]: I1124 09:50:51.063144 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6282c23-c8b5-42a6-86d0-69f59a4a1fbe" path="/var/lib/kubelet/pods/f6282c23-c8b5-42a6-86d0-69f59a4a1fbe/volumes" Nov 24 09:50:59 crc kubenswrapper[4563]: I1124 09:50:59.709523 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-b876f5fb4-sx5lp_29588100-1198-4e82-a1c3-87d27b71aa65/barbican-api/0.log" Nov 24 09:50:59 crc kubenswrapper[4563]: I1124 09:50:59.817285 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-b876f5fb4-sx5lp_29588100-1198-4e82-a1c3-87d27b71aa65/barbican-api-log/0.log" Nov 24 09:50:59 crc kubenswrapper[4563]: I1124 09:50:59.872551 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64cbb46c46-7fsmj_17d8ec67-c825-4ab0-bd77-cd610ff6838e/barbican-keystone-listener/0.log" Nov 24 09:50:59 crc kubenswrapper[4563]: I1124 09:50:59.900422 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64cbb46c46-7fsmj_17d8ec67-c825-4ab0-bd77-cd610ff6838e/barbican-keystone-listener-log/0.log" Nov 24 09:51:00 crc kubenswrapper[4563]: I1124 09:51:00.030603 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6d65456589-92q4d_9d88e05b-2750-483f-a0a3-5169e4cc919c/barbican-worker-log/0.log" Nov 24 09:51:00 crc kubenswrapper[4563]: I1124 09:51:00.045674 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6d65456589-92q4d_9d88e05b-2750-483f-a0a3-5169e4cc919c/barbican-worker/0.log" Nov 24 09:51:00 crc kubenswrapper[4563]: I1124 09:51:00.185170 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2_1c07ab91-ccc4-46d0-b15c-0d20675fc19a/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:51:00 crc kubenswrapper[4563]: I1124 09:51:00.223388 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_86acf291-2839-49f7-aaf3-33ba6e0cae2e/ceilometer-central-agent/0.log" Nov 24 09:51:00 crc kubenswrapper[4563]: I1124 09:51:00.281408 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_86acf291-2839-49f7-aaf3-33ba6e0cae2e/ceilometer-notification-agent/0.log" Nov 24 09:51:00 crc kubenswrapper[4563]: I1124 09:51:00.327342 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_86acf291-2839-49f7-aaf3-33ba6e0cae2e/proxy-httpd/0.log" Nov 24 09:51:00 crc kubenswrapper[4563]: I1124 09:51:00.373056 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_86acf291-2839-49f7-aaf3-33ba6e0cae2e/sg-core/0.log" Nov 24 09:51:00 crc kubenswrapper[4563]: I1124 09:51:00.473895 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_dd5367ce-55f6-4685-b414-4ef54ce7df7a/cinder-api/0.log" Nov 24 09:51:00 crc kubenswrapper[4563]: I1124 09:51:00.514557 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_dd5367ce-55f6-4685-b414-4ef54ce7df7a/cinder-api-log/0.log" Nov 24 09:51:00 crc kubenswrapper[4563]: I1124 09:51:00.626962 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3/cinder-scheduler/0.log" Nov 24 09:51:00 crc kubenswrapper[4563]: I1124 09:51:00.663510 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3/probe/0.log" Nov 24 09:51:00 crc kubenswrapper[4563]: I1124 09:51:00.709968 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj_ea6d045d-1394-436f-9329-9f3a9d10610b/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:51:00 crc kubenswrapper[4563]: I1124 09:51:00.861191 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-64qvz_dd51baae-5c71-4421-9cc1-1095c3bba2e9/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:51:00 crc kubenswrapper[4563]: I1124 09:51:00.905718 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-64858ddbd7-mtmng_34b79993-dd96-4594-a00f-3ca0dd207e62/init/0.log" Nov 24 09:51:01 crc kubenswrapper[4563]: I1124 09:51:01.070334 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-64858ddbd7-mtmng_34b79993-dd96-4594-a00f-3ca0dd207e62/init/0.log" Nov 24 09:51:01 crc kubenswrapper[4563]: I1124 09:51:01.104873 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-64858ddbd7-mtmng_34b79993-dd96-4594-a00f-3ca0dd207e62/dnsmasq-dns/0.log" Nov 24 09:51:01 crc kubenswrapper[4563]: I1124 09:51:01.109444 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-srkpz_75442289-63cd-4b6c-b86d-70ab08ae8dc2/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:51:01 crc kubenswrapper[4563]: I1124 09:51:01.244367 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f6624aa1-6acc-43a1-944e-20a77c1b09d9/glance-httpd/0.log" Nov 24 09:51:01 crc kubenswrapper[4563]: I1124 09:51:01.284775 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f6624aa1-6acc-43a1-944e-20a77c1b09d9/glance-log/0.log" Nov 24 09:51:01 crc kubenswrapper[4563]: I1124 09:51:01.390028 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_b57487f8-d1f8-4f97-b92e-7385ecc88074/glance-log/0.log" Nov 24 09:51:01 crc kubenswrapper[4563]: I1124 09:51:01.390766 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b57487f8-d1f8-4f97-b92e-7385ecc88074/glance-httpd/0.log" Nov 24 09:51:01 crc kubenswrapper[4563]: I1124 09:51:01.532473 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6d999bbd6-cqj6s_a7688cb4-70ea-43e4-85f2-6b96f972538f/horizon/0.log" Nov 24 09:51:01 crc kubenswrapper[4563]: I1124 09:51:01.642413 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr_aa741fe2-400c-479c-bfb3-0d5273b064e2/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:51:01 crc kubenswrapper[4563]: I1124 09:51:01.764162 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6d999bbd6-cqj6s_a7688cb4-70ea-43e4-85f2-6b96f972538f/horizon-log/0.log" Nov 24 09:51:01 crc kubenswrapper[4563]: I1124 09:51:01.777272 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-4bvkr_424c4e83-a3c6-4eea-958e-e0cf83f20fdf/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:51:02 crc kubenswrapper[4563]: I1124 09:51:02.004239 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_86c40cc3-1c2a-47db-9ed2-eb746b65ac4b/kube-state-metrics/0.log" Nov 24 09:51:02 crc kubenswrapper[4563]: I1124 09:51:02.044962 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6565cb8596-rhwtd_d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1/keystone-api/0.log" Nov 24 09:51:02 crc kubenswrapper[4563]: I1124 09:51:02.178247 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-z8q62_96a07419-7337-47f5-89aa-233e06eec048/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:51:02 crc kubenswrapper[4563]: I1124 09:51:02.482558 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-85d84cd957-f2sp9_5c5b560e-1f0c-4469-8455-1aec5e7653bd/neutron-httpd/0.log" Nov 24 09:51:02 crc kubenswrapper[4563]: I1124 09:51:02.518195 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-85d84cd957-f2sp9_5c5b560e-1f0c-4469-8455-1aec5e7653bd/neutron-api/0.log" Nov 24 09:51:02 crc kubenswrapper[4563]: I1124 09:51:02.658478 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2_5c13fe46-9855-4291-b685-df5de9abafa7/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:51:03 crc kubenswrapper[4563]: I1124 09:51:03.099916 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a7e5bec0-7f94-410f-9344-aaa699457924/nova-cell0-conductor-conductor/0.log" Nov 24 09:51:03 crc kubenswrapper[4563]: I1124 09:51:03.158339 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6/nova-api-log/0.log" Nov 24 09:51:03 crc kubenswrapper[4563]: I1124 09:51:03.181739 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6/nova-api-api/0.log" Nov 24 09:51:03 crc kubenswrapper[4563]: I1124 09:51:03.370817 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_89f4e9e2-2f29-4076-a9e3-8513bfd1e07e/nova-cell1-conductor-conductor/0.log" Nov 24 09:51:03 crc kubenswrapper[4563]: I1124 09:51:03.422022 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_15b68912-1886-4162-88c8-02a37d34c54a/nova-cell1-novncproxy-novncproxy/0.log" Nov 24 09:51:03 crc kubenswrapper[4563]: I1124 09:51:03.553786 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-nx4nm_49ccd723-5c1a-4763-9eb4-5aed7651bad5/nova-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:51:03 crc kubenswrapper[4563]: I1124 09:51:03.748278 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_c17fb818-0e53-4655-89ac-a1bb9022b5f8/nova-metadata-log/0.log" Nov 24 09:51:04 crc kubenswrapper[4563]: I1124 09:51:04.022412 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_38360dce-8f0e-42b1-ba4c-d13036b2794a/nova-scheduler-scheduler/0.log" Nov 24 09:51:04 crc kubenswrapper[4563]: I1124 09:51:04.045446 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2c2b6368-21fd-4c13-b008-5fe4be95dc8d/mysql-bootstrap/0.log" Nov 24 09:51:04 crc kubenswrapper[4563]: I1124 09:51:04.224380 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2c2b6368-21fd-4c13-b008-5fe4be95dc8d/galera/0.log" Nov 24 09:51:04 crc kubenswrapper[4563]: I1124 09:51:04.245313 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2c2b6368-21fd-4c13-b008-5fe4be95dc8d/mysql-bootstrap/0.log" Nov 24 09:51:04 crc kubenswrapper[4563]: I1124 09:51:04.408535 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b0de325e-9aea-4ee2-9cc4-093f3d8d3f65/mysql-bootstrap/0.log" Nov 24 09:51:04 crc kubenswrapper[4563]: I1124 09:51:04.457459 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_c17fb818-0e53-4655-89ac-a1bb9022b5f8/nova-metadata-metadata/0.log" Nov 24 09:51:04 crc kubenswrapper[4563]: I1124 09:51:04.574844 4563 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b0de325e-9aea-4ee2-9cc4-093f3d8d3f65/mysql-bootstrap/0.log" Nov 24 09:51:04 crc kubenswrapper[4563]: I1124 09:51:04.590625 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b0de325e-9aea-4ee2-9cc4-093f3d8d3f65/galera/0.log" Nov 24 09:51:04 crc kubenswrapper[4563]: I1124 09:51:04.668229 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_9b1ff524-d9dc-4433-a21c-f6d00e3b89d4/openstackclient/0.log" Nov 24 09:51:04 crc kubenswrapper[4563]: I1124 09:51:04.769302 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-cblxg_d5d57856-0858-4ef6-86b1-282d4bc462be/openstack-network-exporter/0.log" Nov 24 09:51:04 crc kubenswrapper[4563]: I1124 09:51:04.868817 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6z24f_01d7f46a-ff30-4904-a63a-8d41cea54dd7/ovsdb-server-init/0.log" Nov 24 09:51:05 crc kubenswrapper[4563]: I1124 09:51:05.135609 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6z24f_01d7f46a-ff30-4904-a63a-8d41cea54dd7/ovsdb-server-init/0.log" Nov 24 09:51:05 crc kubenswrapper[4563]: I1124 09:51:05.139478 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6z24f_01d7f46a-ff30-4904-a63a-8d41cea54dd7/ovs-vswitchd/0.log" Nov 24 09:51:05 crc kubenswrapper[4563]: I1124 09:51:05.198905 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6z24f_01d7f46a-ff30-4904-a63a-8d41cea54dd7/ovsdb-server/0.log" Nov 24 09:51:05 crc kubenswrapper[4563]: I1124 09:51:05.338201 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-qtfnl_241e854a-eb29-4933-98be-bad6b9295260/ovn-controller/0.log" Nov 24 09:51:05 crc kubenswrapper[4563]: I1124 09:51:05.396353 4563 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-tt4mm_5ab3a15a-af4c-43a3-9d3a-1515e2c8228b/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:51:05 crc kubenswrapper[4563]: I1124 09:51:05.509996 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c78e2b4d-f2bf-435e-b163-c9415021f43c/openstack-network-exporter/0.log" Nov 24 09:51:05 crc kubenswrapper[4563]: I1124 09:51:05.557893 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c78e2b4d-f2bf-435e-b163-c9415021f43c/ovn-northd/0.log" Nov 24 09:51:05 crc kubenswrapper[4563]: I1124 09:51:05.711994 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3399d213-46c4-42c1-9d69-26246c4ed771/openstack-network-exporter/0.log" Nov 24 09:51:05 crc kubenswrapper[4563]: I1124 09:51:05.728628 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3399d213-46c4-42c1-9d69-26246c4ed771/ovsdbserver-nb/0.log" Nov 24 09:51:05 crc kubenswrapper[4563]: I1124 09:51:05.831217 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_04022879-4b41-4e57-ae94-a3517d382e7d/openstack-network-exporter/0.log" Nov 24 09:51:05 crc kubenswrapper[4563]: I1124 09:51:05.870020 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_04022879-4b41-4e57-ae94-a3517d382e7d/ovsdbserver-sb/0.log" Nov 24 09:51:06 crc kubenswrapper[4563]: I1124 09:51:06.015904 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-748c4bdffd-w974j_9103bb32-e426-4c4b-ade8-d3430cf5ca11/placement-api/0.log" Nov 24 09:51:06 crc kubenswrapper[4563]: I1124 09:51:06.046538 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-748c4bdffd-w974j_9103bb32-e426-4c4b-ade8-d3430cf5ca11/placement-log/0.log" Nov 24 09:51:06 crc kubenswrapper[4563]: I1124 09:51:06.159234 4563 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_62faa658-2c71-4afe-9fc2-4d9fd0079928/setup-container/0.log" Nov 24 09:51:06 crc kubenswrapper[4563]: I1124 09:51:06.399686 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_62faa658-2c71-4afe-9fc2-4d9fd0079928/setup-container/0.log" Nov 24 09:51:06 crc kubenswrapper[4563]: I1124 09:51:06.399781 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_45003ba2-beec-43e7-9248-42c517ed3bf7/setup-container/0.log" Nov 24 09:51:06 crc kubenswrapper[4563]: I1124 09:51:06.430670 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_62faa658-2c71-4afe-9fc2-4d9fd0079928/rabbitmq/0.log" Nov 24 09:51:06 crc kubenswrapper[4563]: I1124 09:51:06.558416 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_45003ba2-beec-43e7-9248-42c517ed3bf7/setup-container/0.log" Nov 24 09:51:06 crc kubenswrapper[4563]: I1124 09:51:06.590307 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_45003ba2-beec-43e7-9248-42c517ed3bf7/rabbitmq/0.log" Nov 24 09:51:06 crc kubenswrapper[4563]: I1124 09:51:06.615594 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw_bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:51:06 crc kubenswrapper[4563]: I1124 09:51:06.788757 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-tmkfw_96dc1f15-b31a-4eb6-91e7-35b341f1347a/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:51:06 crc kubenswrapper[4563]: I1124 09:51:06.871865 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx_c3cdb156-f67f-4dd2-b04d-9fb263802321/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:51:06 crc kubenswrapper[4563]: I1124 09:51:06.971253 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-mbq9m_5f2b4785-aae5-4031-9e66-c3601ef67b6a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:51:07 crc kubenswrapper[4563]: I1124 09:51:07.080098 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-4lklp_77470e38-d989-4832-8cac-4b2f1a8f2d14/ssh-known-hosts-edpm-deployment/0.log" Nov 24 09:51:07 crc kubenswrapper[4563]: I1124 09:51:07.241992 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5487cdb76f-rn9rx_64984138-1ff3-4d53-b4b9-e301fc5f2f80/proxy-server/0.log" Nov 24 09:51:07 crc kubenswrapper[4563]: I1124 09:51:07.285329 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5487cdb76f-rn9rx_64984138-1ff3-4d53-b4b9-e301fc5f2f80/proxy-httpd/0.log" Nov 24 09:51:07 crc kubenswrapper[4563]: I1124 09:51:07.356907 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-mgbkw_874b4a65-f3cc-4bb7-9634-0a464700f823/swift-ring-rebalance/0.log" Nov 24 09:51:07 crc kubenswrapper[4563]: I1124 09:51:07.448650 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/account-auditor/0.log" Nov 24 09:51:07 crc kubenswrapper[4563]: I1124 09:51:07.525353 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/account-reaper/0.log" Nov 24 09:51:07 crc kubenswrapper[4563]: I1124 09:51:07.541682 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/account-replicator/0.log" Nov 24 09:51:07 crc kubenswrapper[4563]: I1124 09:51:07.671731 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/account-server/0.log" Nov 24 09:51:07 crc kubenswrapper[4563]: I1124 09:51:07.677973 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/container-auditor/0.log" Nov 24 09:51:07 crc kubenswrapper[4563]: I1124 09:51:07.719363 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/container-server/0.log" Nov 24 09:51:07 crc kubenswrapper[4563]: I1124 09:51:07.739346 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/container-replicator/0.log" Nov 24 09:51:07 crc kubenswrapper[4563]: I1124 09:51:07.877458 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/container-updater/0.log" Nov 24 09:51:07 crc kubenswrapper[4563]: I1124 09:51:07.881011 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/object-auditor/0.log" Nov 24 09:51:07 crc kubenswrapper[4563]: I1124 09:51:07.900088 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/object-expirer/0.log" Nov 24 09:51:07 crc kubenswrapper[4563]: I1124 09:51:07.909083 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/object-replicator/0.log" Nov 24 09:51:08 crc kubenswrapper[4563]: I1124 09:51:08.025605 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/object-updater/0.log" Nov 24 09:51:08 crc kubenswrapper[4563]: I1124 09:51:08.043838 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/object-server/0.log" Nov 24 09:51:08 crc kubenswrapper[4563]: I1124 09:51:08.094028 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/swift-recon-cron/0.log" Nov 24 09:51:08 crc kubenswrapper[4563]: I1124 09:51:08.128585 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/rsync/0.log" Nov 24 09:51:08 crc kubenswrapper[4563]: I1124 09:51:08.328820 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z_0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:51:08 crc kubenswrapper[4563]: I1124 09:51:08.336095 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_d15e06ff-83ac-44e9-aebe-9756628722e6/tempest-tests-tempest-tests-runner/0.log" Nov 24 09:51:08 crc kubenswrapper[4563]: I1124 09:51:08.474021 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_b8a9086a-b375-485e-990a-27b6e4832c77/test-operator-logs-container/0.log" Nov 24 09:51:08 crc kubenswrapper[4563]: I1124 09:51:08.584289 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm_72560768-c189-4eaa-9128-486ec369275b/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:51:08 crc kubenswrapper[4563]: I1124 09:51:08.987057 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:51:08 crc kubenswrapper[4563]: I1124 09:51:08.987110 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:51:13 crc kubenswrapper[4563]: I1124 09:51:13.786694 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qzbwh"] Nov 24 09:51:13 crc kubenswrapper[4563]: E1124 09:51:13.787753 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6282c23-c8b5-42a6-86d0-69f59a4a1fbe" containerName="container-00" Nov 24 09:51:13 crc kubenswrapper[4563]: I1124 09:51:13.787769 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6282c23-c8b5-42a6-86d0-69f59a4a1fbe" containerName="container-00" Nov 24 09:51:13 crc kubenswrapper[4563]: I1124 09:51:13.787993 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6282c23-c8b5-42a6-86d0-69f59a4a1fbe" containerName="container-00" Nov 24 09:51:13 crc kubenswrapper[4563]: I1124 09:51:13.789621 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qzbwh" Nov 24 09:51:13 crc kubenswrapper[4563]: I1124 09:51:13.796550 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qzbwh"] Nov 24 09:51:13 crc kubenswrapper[4563]: I1124 09:51:13.913837 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqdmv\" (UniqueName: \"kubernetes.io/projected/ee0711f0-5fe8-48d2-a50b-805157edd591-kube-api-access-bqdmv\") pod \"redhat-marketplace-qzbwh\" (UID: \"ee0711f0-5fe8-48d2-a50b-805157edd591\") " pod="openshift-marketplace/redhat-marketplace-qzbwh" Nov 24 09:51:13 crc kubenswrapper[4563]: I1124 09:51:13.913958 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee0711f0-5fe8-48d2-a50b-805157edd591-utilities\") pod \"redhat-marketplace-qzbwh\" (UID: \"ee0711f0-5fe8-48d2-a50b-805157edd591\") " pod="openshift-marketplace/redhat-marketplace-qzbwh" Nov 24 09:51:13 crc kubenswrapper[4563]: I1124 09:51:13.914272 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee0711f0-5fe8-48d2-a50b-805157edd591-catalog-content\") pod \"redhat-marketplace-qzbwh\" (UID: \"ee0711f0-5fe8-48d2-a50b-805157edd591\") " pod="openshift-marketplace/redhat-marketplace-qzbwh" Nov 24 09:51:14 crc kubenswrapper[4563]: I1124 09:51:14.015618 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee0711f0-5fe8-48d2-a50b-805157edd591-catalog-content\") pod \"redhat-marketplace-qzbwh\" (UID: \"ee0711f0-5fe8-48d2-a50b-805157edd591\") " pod="openshift-marketplace/redhat-marketplace-qzbwh" Nov 24 09:51:14 crc kubenswrapper[4563]: I1124 09:51:14.015684 4563 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bqdmv\" (UniqueName: \"kubernetes.io/projected/ee0711f0-5fe8-48d2-a50b-805157edd591-kube-api-access-bqdmv\") pod \"redhat-marketplace-qzbwh\" (UID: \"ee0711f0-5fe8-48d2-a50b-805157edd591\") " pod="openshift-marketplace/redhat-marketplace-qzbwh" Nov 24 09:51:14 crc kubenswrapper[4563]: I1124 09:51:14.015740 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee0711f0-5fe8-48d2-a50b-805157edd591-utilities\") pod \"redhat-marketplace-qzbwh\" (UID: \"ee0711f0-5fe8-48d2-a50b-805157edd591\") " pod="openshift-marketplace/redhat-marketplace-qzbwh" Nov 24 09:51:14 crc kubenswrapper[4563]: I1124 09:51:14.016019 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee0711f0-5fe8-48d2-a50b-805157edd591-catalog-content\") pod \"redhat-marketplace-qzbwh\" (UID: \"ee0711f0-5fe8-48d2-a50b-805157edd591\") " pod="openshift-marketplace/redhat-marketplace-qzbwh" Nov 24 09:51:14 crc kubenswrapper[4563]: I1124 09:51:14.016226 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee0711f0-5fe8-48d2-a50b-805157edd591-utilities\") pod \"redhat-marketplace-qzbwh\" (UID: \"ee0711f0-5fe8-48d2-a50b-805157edd591\") " pod="openshift-marketplace/redhat-marketplace-qzbwh" Nov 24 09:51:14 crc kubenswrapper[4563]: I1124 09:51:14.033034 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqdmv\" (UniqueName: \"kubernetes.io/projected/ee0711f0-5fe8-48d2-a50b-805157edd591-kube-api-access-bqdmv\") pod \"redhat-marketplace-qzbwh\" (UID: \"ee0711f0-5fe8-48d2-a50b-805157edd591\") " pod="openshift-marketplace/redhat-marketplace-qzbwh" Nov 24 09:51:14 crc kubenswrapper[4563]: I1124 09:51:14.113083 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qzbwh" Nov 24 09:51:14 crc kubenswrapper[4563]: I1124 09:51:14.602067 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qzbwh"] Nov 24 09:51:15 crc kubenswrapper[4563]: I1124 09:51:15.571147 4563 generic.go:334] "Generic (PLEG): container finished" podID="ee0711f0-5fe8-48d2-a50b-805157edd591" containerID="a9ce22d33c583d3ed0728825843cd453ecdec17d97464f02f41a96a9ab88b5ac" exitCode=0 Nov 24 09:51:15 crc kubenswrapper[4563]: I1124 09:51:15.571422 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzbwh" event={"ID":"ee0711f0-5fe8-48d2-a50b-805157edd591","Type":"ContainerDied","Data":"a9ce22d33c583d3ed0728825843cd453ecdec17d97464f02f41a96a9ab88b5ac"} Nov 24 09:51:15 crc kubenswrapper[4563]: I1124 09:51:15.571484 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzbwh" event={"ID":"ee0711f0-5fe8-48d2-a50b-805157edd591","Type":"ContainerStarted","Data":"5b09da3de422ba5568b9d2f43891859578968d37871b52d5c62f9a282afd4094"} Nov 24 09:51:16 crc kubenswrapper[4563]: I1124 09:51:16.581662 4563 generic.go:334] "Generic (PLEG): container finished" podID="ee0711f0-5fe8-48d2-a50b-805157edd591" containerID="25dc550c5b6c68d16610ac5bdb3db0a400fa914dec2c75324f1afbb6f5cb8c2f" exitCode=0 Nov 24 09:51:16 crc kubenswrapper[4563]: I1124 09:51:16.581753 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzbwh" event={"ID":"ee0711f0-5fe8-48d2-a50b-805157edd591","Type":"ContainerDied","Data":"25dc550c5b6c68d16610ac5bdb3db0a400fa914dec2c75324f1afbb6f5cb8c2f"} Nov 24 09:51:17 crc kubenswrapper[4563]: I1124 09:51:17.593461 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzbwh" 
event={"ID":"ee0711f0-5fe8-48d2-a50b-805157edd591","Type":"ContainerStarted","Data":"53f57b648bb8af1af922deb38a51bf4359bf649fe3aeea466ab31b1923be95dd"} Nov 24 09:51:17 crc kubenswrapper[4563]: I1124 09:51:17.617371 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qzbwh" podStartSLOduration=2.818448343 podStartE2EDuration="4.617328856s" podCreationTimestamp="2025-11-24 09:51:13 +0000 UTC" firstStartedPulling="2025-11-24 09:51:15.572900441 +0000 UTC m=+2852.831877889" lastFinishedPulling="2025-11-24 09:51:17.371780955 +0000 UTC m=+2854.630758402" observedRunningTime="2025-11-24 09:51:17.607296176 +0000 UTC m=+2854.866273623" watchObservedRunningTime="2025-11-24 09:51:17.617328856 +0000 UTC m=+2854.876321894" Nov 24 09:51:18 crc kubenswrapper[4563]: I1124 09:51:18.063485 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_bdeec6b1-05d8-4275-839f-a02e22e26f61/memcached/0.log" Nov 24 09:51:24 crc kubenswrapper[4563]: I1124 09:51:24.113559 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qzbwh" Nov 24 09:51:24 crc kubenswrapper[4563]: I1124 09:51:24.114356 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qzbwh" Nov 24 09:51:24 crc kubenswrapper[4563]: I1124 09:51:24.161807 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qzbwh" Nov 24 09:51:24 crc kubenswrapper[4563]: I1124 09:51:24.688020 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qzbwh" Nov 24 09:51:24 crc kubenswrapper[4563]: I1124 09:51:24.735575 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qzbwh"] Nov 24 09:51:26 crc kubenswrapper[4563]: I1124 09:51:26.664593 4563 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qzbwh" podUID="ee0711f0-5fe8-48d2-a50b-805157edd591" containerName="registry-server" containerID="cri-o://53f57b648bb8af1af922deb38a51bf4359bf649fe3aeea466ab31b1923be95dd" gracePeriod=2 Nov 24 09:51:27 crc kubenswrapper[4563]: I1124 09:51:27.101026 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qzbwh" Nov 24 09:51:27 crc kubenswrapper[4563]: I1124 09:51:27.296485 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqdmv\" (UniqueName: \"kubernetes.io/projected/ee0711f0-5fe8-48d2-a50b-805157edd591-kube-api-access-bqdmv\") pod \"ee0711f0-5fe8-48d2-a50b-805157edd591\" (UID: \"ee0711f0-5fe8-48d2-a50b-805157edd591\") " Nov 24 09:51:27 crc kubenswrapper[4563]: I1124 09:51:27.296743 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee0711f0-5fe8-48d2-a50b-805157edd591-utilities\") pod \"ee0711f0-5fe8-48d2-a50b-805157edd591\" (UID: \"ee0711f0-5fe8-48d2-a50b-805157edd591\") " Nov 24 09:51:27 crc kubenswrapper[4563]: I1124 09:51:27.296848 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee0711f0-5fe8-48d2-a50b-805157edd591-catalog-content\") pod \"ee0711f0-5fe8-48d2-a50b-805157edd591\" (UID: \"ee0711f0-5fe8-48d2-a50b-805157edd591\") " Nov 24 09:51:27 crc kubenswrapper[4563]: I1124 09:51:27.297399 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee0711f0-5fe8-48d2-a50b-805157edd591-utilities" (OuterVolumeSpecName: "utilities") pod "ee0711f0-5fe8-48d2-a50b-805157edd591" (UID: "ee0711f0-5fe8-48d2-a50b-805157edd591"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:51:27 crc kubenswrapper[4563]: I1124 09:51:27.297567 4563 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee0711f0-5fe8-48d2-a50b-805157edd591-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 09:51:27 crc kubenswrapper[4563]: I1124 09:51:27.301921 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee0711f0-5fe8-48d2-a50b-805157edd591-kube-api-access-bqdmv" (OuterVolumeSpecName: "kube-api-access-bqdmv") pod "ee0711f0-5fe8-48d2-a50b-805157edd591" (UID: "ee0711f0-5fe8-48d2-a50b-805157edd591"). InnerVolumeSpecName "kube-api-access-bqdmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:51:27 crc kubenswrapper[4563]: I1124 09:51:27.311125 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee0711f0-5fe8-48d2-a50b-805157edd591-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee0711f0-5fe8-48d2-a50b-805157edd591" (UID: "ee0711f0-5fe8-48d2-a50b-805157edd591"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:51:27 crc kubenswrapper[4563]: I1124 09:51:27.400083 4563 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee0711f0-5fe8-48d2-a50b-805157edd591-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 09:51:27 crc kubenswrapper[4563]: I1124 09:51:27.400128 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqdmv\" (UniqueName: \"kubernetes.io/projected/ee0711f0-5fe8-48d2-a50b-805157edd591-kube-api-access-bqdmv\") on node \"crc\" DevicePath \"\"" Nov 24 09:51:27 crc kubenswrapper[4563]: I1124 09:51:27.678727 4563 generic.go:334] "Generic (PLEG): container finished" podID="ee0711f0-5fe8-48d2-a50b-805157edd591" containerID="53f57b648bb8af1af922deb38a51bf4359bf649fe3aeea466ab31b1923be95dd" exitCode=0 Nov 24 09:51:27 crc kubenswrapper[4563]: I1124 09:51:27.678818 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qzbwh" Nov 24 09:51:27 crc kubenswrapper[4563]: I1124 09:51:27.678823 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzbwh" event={"ID":"ee0711f0-5fe8-48d2-a50b-805157edd591","Type":"ContainerDied","Data":"53f57b648bb8af1af922deb38a51bf4359bf649fe3aeea466ab31b1923be95dd"} Nov 24 09:51:27 crc kubenswrapper[4563]: I1124 09:51:27.678972 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzbwh" event={"ID":"ee0711f0-5fe8-48d2-a50b-805157edd591","Type":"ContainerDied","Data":"5b09da3de422ba5568b9d2f43891859578968d37871b52d5c62f9a282afd4094"} Nov 24 09:51:27 crc kubenswrapper[4563]: I1124 09:51:27.678999 4563 scope.go:117] "RemoveContainer" containerID="53f57b648bb8af1af922deb38a51bf4359bf649fe3aeea466ab31b1923be95dd" Nov 24 09:51:27 crc kubenswrapper[4563]: I1124 09:51:27.707734 4563 scope.go:117] "RemoveContainer" 
containerID="25dc550c5b6c68d16610ac5bdb3db0a400fa914dec2c75324f1afbb6f5cb8c2f" Nov 24 09:51:27 crc kubenswrapper[4563]: I1124 09:51:27.718341 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qzbwh"] Nov 24 09:51:27 crc kubenswrapper[4563]: I1124 09:51:27.725680 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qzbwh"] Nov 24 09:51:27 crc kubenswrapper[4563]: I1124 09:51:27.744379 4563 scope.go:117] "RemoveContainer" containerID="a9ce22d33c583d3ed0728825843cd453ecdec17d97464f02f41a96a9ab88b5ac" Nov 24 09:51:27 crc kubenswrapper[4563]: I1124 09:51:27.763405 4563 scope.go:117] "RemoveContainer" containerID="53f57b648bb8af1af922deb38a51bf4359bf649fe3aeea466ab31b1923be95dd" Nov 24 09:51:27 crc kubenswrapper[4563]: E1124 09:51:27.763823 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53f57b648bb8af1af922deb38a51bf4359bf649fe3aeea466ab31b1923be95dd\": container with ID starting with 53f57b648bb8af1af922deb38a51bf4359bf649fe3aeea466ab31b1923be95dd not found: ID does not exist" containerID="53f57b648bb8af1af922deb38a51bf4359bf649fe3aeea466ab31b1923be95dd" Nov 24 09:51:27 crc kubenswrapper[4563]: I1124 09:51:27.763857 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53f57b648bb8af1af922deb38a51bf4359bf649fe3aeea466ab31b1923be95dd"} err="failed to get container status \"53f57b648bb8af1af922deb38a51bf4359bf649fe3aeea466ab31b1923be95dd\": rpc error: code = NotFound desc = could not find container \"53f57b648bb8af1af922deb38a51bf4359bf649fe3aeea466ab31b1923be95dd\": container with ID starting with 53f57b648bb8af1af922deb38a51bf4359bf649fe3aeea466ab31b1923be95dd not found: ID does not exist" Nov 24 09:51:27 crc kubenswrapper[4563]: I1124 09:51:27.763880 4563 scope.go:117] "RemoveContainer" 
containerID="25dc550c5b6c68d16610ac5bdb3db0a400fa914dec2c75324f1afbb6f5cb8c2f" Nov 24 09:51:27 crc kubenswrapper[4563]: E1124 09:51:27.764181 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25dc550c5b6c68d16610ac5bdb3db0a400fa914dec2c75324f1afbb6f5cb8c2f\": container with ID starting with 25dc550c5b6c68d16610ac5bdb3db0a400fa914dec2c75324f1afbb6f5cb8c2f not found: ID does not exist" containerID="25dc550c5b6c68d16610ac5bdb3db0a400fa914dec2c75324f1afbb6f5cb8c2f" Nov 24 09:51:27 crc kubenswrapper[4563]: I1124 09:51:27.764203 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25dc550c5b6c68d16610ac5bdb3db0a400fa914dec2c75324f1afbb6f5cb8c2f"} err="failed to get container status \"25dc550c5b6c68d16610ac5bdb3db0a400fa914dec2c75324f1afbb6f5cb8c2f\": rpc error: code = NotFound desc = could not find container \"25dc550c5b6c68d16610ac5bdb3db0a400fa914dec2c75324f1afbb6f5cb8c2f\": container with ID starting with 25dc550c5b6c68d16610ac5bdb3db0a400fa914dec2c75324f1afbb6f5cb8c2f not found: ID does not exist" Nov 24 09:51:27 crc kubenswrapper[4563]: I1124 09:51:27.764218 4563 scope.go:117] "RemoveContainer" containerID="a9ce22d33c583d3ed0728825843cd453ecdec17d97464f02f41a96a9ab88b5ac" Nov 24 09:51:27 crc kubenswrapper[4563]: E1124 09:51:27.764487 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9ce22d33c583d3ed0728825843cd453ecdec17d97464f02f41a96a9ab88b5ac\": container with ID starting with a9ce22d33c583d3ed0728825843cd453ecdec17d97464f02f41a96a9ab88b5ac not found: ID does not exist" containerID="a9ce22d33c583d3ed0728825843cd453ecdec17d97464f02f41a96a9ab88b5ac" Nov 24 09:51:27 crc kubenswrapper[4563]: I1124 09:51:27.764526 4563 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a9ce22d33c583d3ed0728825843cd453ecdec17d97464f02f41a96a9ab88b5ac"} err="failed to get container status \"a9ce22d33c583d3ed0728825843cd453ecdec17d97464f02f41a96a9ab88b5ac\": rpc error: code = NotFound desc = could not find container \"a9ce22d33c583d3ed0728825843cd453ecdec17d97464f02f41a96a9ab88b5ac\": container with ID starting with a9ce22d33c583d3ed0728825843cd453ecdec17d97464f02f41a96a9ab88b5ac not found: ID does not exist" Nov 24 09:51:29 crc kubenswrapper[4563]: I1124 09:51:29.065461 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee0711f0-5fe8-48d2-a50b-805157edd591" path="/var/lib/kubelet/pods/ee0711f0-5fe8-48d2-a50b-805157edd591/volumes" Nov 24 09:51:29 crc kubenswrapper[4563]: I1124 09:51:29.080806 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d_8d314d9f-2d34-4e4c-899e-a113c55ad0df/util/0.log" Nov 24 09:51:29 crc kubenswrapper[4563]: I1124 09:51:29.199098 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d_8d314d9f-2d34-4e4c-899e-a113c55ad0df/util/0.log" Nov 24 09:51:29 crc kubenswrapper[4563]: I1124 09:51:29.228079 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d_8d314d9f-2d34-4e4c-899e-a113c55ad0df/pull/0.log" Nov 24 09:51:29 crc kubenswrapper[4563]: I1124 09:51:29.254193 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d_8d314d9f-2d34-4e4c-899e-a113c55ad0df/pull/0.log" Nov 24 09:51:29 crc kubenswrapper[4563]: I1124 09:51:29.389364 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d_8d314d9f-2d34-4e4c-899e-a113c55ad0df/util/0.log" Nov 24 09:51:29 crc kubenswrapper[4563]: I1124 09:51:29.411555 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d_8d314d9f-2d34-4e4c-899e-a113c55ad0df/extract/0.log" Nov 24 09:51:29 crc kubenswrapper[4563]: I1124 09:51:29.430865 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d_8d314d9f-2d34-4e4c-899e-a113c55ad0df/pull/0.log" Nov 24 09:51:29 crc kubenswrapper[4563]: I1124 09:51:29.518282 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7768f8c84f-glf4s_f81c148e-bf8e-4b57-895e-f2c11411cf7a/kube-rbac-proxy/0.log" Nov 24 09:51:29 crc kubenswrapper[4563]: I1124 09:51:29.585508 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6d8fd67bf7-jnx9f_a62a6523-e592-437f-b3ba-320e24f619dc/kube-rbac-proxy/0.log" Nov 24 09:51:29 crc kubenswrapper[4563]: I1124 09:51:29.619898 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7768f8c84f-glf4s_f81c148e-bf8e-4b57-895e-f2c11411cf7a/manager/0.log" Nov 24 09:51:29 crc kubenswrapper[4563]: I1124 09:51:29.732860 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6d8fd67bf7-jnx9f_a62a6523-e592-437f-b3ba-320e24f619dc/manager/0.log" Nov 24 09:51:29 crc kubenswrapper[4563]: I1124 09:51:29.756842 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-56dfb6b67f-77wgb_17904228-d0e5-489c-a965-5cba44f3b3f2/kube-rbac-proxy/0.log" Nov 24 09:51:29 crc 
kubenswrapper[4563]: I1124 09:51:29.770027 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-56dfb6b67f-77wgb_17904228-d0e5-489c-a965-5cba44f3b3f2/manager/0.log" Nov 24 09:51:29 crc kubenswrapper[4563]: I1124 09:51:29.913489 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8667fbf6f6-k9wzp_77d539d7-5235-4576-a276-8247c5824020/kube-rbac-proxy/0.log" Nov 24 09:51:29 crc kubenswrapper[4563]: I1124 09:51:29.985847 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8667fbf6f6-k9wzp_77d539d7-5235-4576-a276-8247c5824020/manager/0.log" Nov 24 09:51:30 crc kubenswrapper[4563]: I1124 09:51:30.053307 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-bf4c6585d-tnxst_b4f4311c-5634-4bae-8659-5efa662f0562/kube-rbac-proxy/0.log" Nov 24 09:51:30 crc kubenswrapper[4563]: I1124 09:51:30.084199 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-bf4c6585d-tnxst_b4f4311c-5634-4bae-8659-5efa662f0562/manager/0.log" Nov 24 09:51:30 crc kubenswrapper[4563]: I1124 09:51:30.150615 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5d86b44686-4x76m_70a63634-9a9f-46b3-af05-9dc02c0a03e1/kube-rbac-proxy/0.log" Nov 24 09:51:30 crc kubenswrapper[4563]: I1124 09:51:30.229486 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5d86b44686-4x76m_70a63634-9a9f-46b3-af05-9dc02c0a03e1/manager/0.log" Nov 24 09:51:30 crc kubenswrapper[4563]: I1124 09:51:30.302311 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-769d9c7585-4f5hq_68eeb4a0-b192-4e6a-b02b-f34415b29316/kube-rbac-proxy/0.log" Nov 24 09:51:30 crc kubenswrapper[4563]: I1124 09:51:30.449093 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-769d9c7585-4f5hq_68eeb4a0-b192-4e6a-b02b-f34415b29316/manager/0.log" Nov 24 09:51:30 crc kubenswrapper[4563]: I1124 09:51:30.462807 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5c75d7c94b-ltqbl_26aa13a3-737a-457f-9d46-29018cfccd1e/manager/0.log" Nov 24 09:51:30 crc kubenswrapper[4563]: I1124 09:51:30.465554 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5c75d7c94b-ltqbl_26aa13a3-737a-457f-9d46-29018cfccd1e/kube-rbac-proxy/0.log" Nov 24 09:51:30 crc kubenswrapper[4563]: I1124 09:51:30.587359 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7879fb76fd-4tv9l_ebed0d67-0bac-4d1f-a2d0-2e367d78d157/kube-rbac-proxy/0.log" Nov 24 09:51:30 crc kubenswrapper[4563]: I1124 09:51:30.673734 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7879fb76fd-4tv9l_ebed0d67-0bac-4d1f-a2d0-2e367d78d157/manager/0.log" Nov 24 09:51:30 crc kubenswrapper[4563]: I1124 09:51:30.752865 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7bb88cb858-44jfn_13a3e7f4-4c3d-46e7-a7ee-612f9ac17bc7/kube-rbac-proxy/0.log" Nov 24 09:51:30 crc kubenswrapper[4563]: I1124 09:51:30.817251 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6f8c5b86cb-94tjk_a30aea9a-f4c8-42a3-89bb-af9ffef55544/kube-rbac-proxy/0.log" Nov 24 09:51:30 crc kubenswrapper[4563]: I1124 
09:51:30.818732 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7bb88cb858-44jfn_13a3e7f4-4c3d-46e7-a7ee-612f9ac17bc7/manager/0.log" Nov 24 09:51:30 crc kubenswrapper[4563]: I1124 09:51:30.957337 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6f8c5b86cb-94tjk_a30aea9a-f4c8-42a3-89bb-af9ffef55544/manager/0.log" Nov 24 09:51:30 crc kubenswrapper[4563]: I1124 09:51:30.983778 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-66b7d6f598-fffcm_ffcb9e74-1697-402a-b77b-5a3ecc832759/kube-rbac-proxy/0.log" Nov 24 09:51:31 crc kubenswrapper[4563]: I1124 09:51:31.016186 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-66b7d6f598-fffcm_ffcb9e74-1697-402a-b77b-5a3ecc832759/manager/0.log" Nov 24 09:51:31 crc kubenswrapper[4563]: I1124 09:51:31.124931 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-86d796d84d-vkltr_c089c738-65b8-46e2-91c9-59b962081c05/kube-rbac-proxy/0.log" Nov 24 09:51:31 crc kubenswrapper[4563]: I1124 09:51:31.212709 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-86d796d84d-vkltr_c089c738-65b8-46e2-91c9-59b962081c05/manager/0.log" Nov 24 09:51:31 crc kubenswrapper[4563]: I1124 09:51:31.291256 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6fdc856c5d-h78s9_71d78263-9c76-454f-8b9f-1392c9fcfc2f/kube-rbac-proxy/0.log" Nov 24 09:51:31 crc kubenswrapper[4563]: I1124 09:51:31.308214 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6fdc856c5d-h78s9_71d78263-9c76-454f-8b9f-1392c9fcfc2f/manager/0.log" Nov 24 
09:51:31 crc kubenswrapper[4563]: I1124 09:51:31.398304 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-79d88dcd444qmtr_974a1619-7c48-46d6-b639-5f965c6b747a/kube-rbac-proxy/0.log" Nov 24 09:51:31 crc kubenswrapper[4563]: I1124 09:51:31.464957 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-79d88dcd444qmtr_974a1619-7c48-46d6-b639-5f965c6b747a/manager/0.log" Nov 24 09:51:31 crc kubenswrapper[4563]: I1124 09:51:31.566854 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6cb9dc54f8-m7w2q_cdec1b8b-630a-452a-b4d9-3cd42ef204c7/kube-rbac-proxy/0.log" Nov 24 09:51:31 crc kubenswrapper[4563]: I1124 09:51:31.724581 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-8486c7f98b-xz8g7_56c65669-5fad-40b3-aec8-b459c3e6b0f8/kube-rbac-proxy/0.log" Nov 24 09:51:31 crc kubenswrapper[4563]: I1124 09:51:31.910888 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-8486c7f98b-xz8g7_56c65669-5fad-40b3-aec8-b459c3e6b0f8/operator/0.log" Nov 24 09:51:32 crc kubenswrapper[4563]: I1124 09:51:32.002055 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-5f5wm_0079c598-0bc4-4809-9813-0aa163a961a1/registry-server/0.log" Nov 24 09:51:32 crc kubenswrapper[4563]: I1124 09:51:32.192774 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5bdf4f7f7f-6n5jh_9fb1ddc7-1195-412e-93ed-4799bc756bae/kube-rbac-proxy/0.log" Nov 24 09:51:32 crc kubenswrapper[4563]: I1124 09:51:32.308987 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5bdf4f7f7f-6n5jh_9fb1ddc7-1195-412e-93ed-4799bc756bae/manager/0.log" Nov 24 09:51:32 crc kubenswrapper[4563]: I1124 09:51:32.376589 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-6dc664666c-6flr8_6a018387-ddf9-40f3-a421-d1a760581c8f/manager/0.log" Nov 24 09:51:32 crc kubenswrapper[4563]: I1124 09:51:32.377745 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-6dc664666c-6flr8_6a018387-ddf9-40f3-a421-d1a760581c8f/kube-rbac-proxy/0.log" Nov 24 09:51:32 crc kubenswrapper[4563]: I1124 09:51:32.537002 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-g64g6_00f5e4f8-193c-48df-b29f-8f359f263a5a/operator/0.log" Nov 24 09:51:32 crc kubenswrapper[4563]: I1124 09:51:32.573885 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6cb9dc54f8-m7w2q_cdec1b8b-630a-452a-b4d9-3cd42ef204c7/manager/0.log" Nov 24 09:51:32 crc kubenswrapper[4563]: I1124 09:51:32.617436 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-799cb6ffd6-wck8j_31e8d237-829e-47b0-8a2c-8e316a37dc78/kube-rbac-proxy/0.log" Nov 24 09:51:32 crc kubenswrapper[4563]: I1124 09:51:32.687904 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-799cb6ffd6-wck8j_31e8d237-829e-47b0-8a2c-8e316a37dc78/manager/0.log" Nov 24 09:51:32 crc kubenswrapper[4563]: I1124 09:51:32.735419 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7798859c74-z5b6f_9d2aa6d4-db94-44dd-99f1-6e95e8de9a5f/kube-rbac-proxy/0.log" Nov 24 09:51:32 crc kubenswrapper[4563]: I1124 09:51:32.778195 4563 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7798859c74-z5b6f_9d2aa6d4-db94-44dd-99f1-6e95e8de9a5f/manager/0.log" Nov 24 09:51:32 crc kubenswrapper[4563]: I1124 09:51:32.860758 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-8464cf66df-chpfj_238f517b-0e10-411c-8b3c-c6bdbe261159/kube-rbac-proxy/0.log" Nov 24 09:51:32 crc kubenswrapper[4563]: I1124 09:51:32.899572 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-8464cf66df-chpfj_238f517b-0e10-411c-8b3c-c6bdbe261159/manager/0.log" Nov 24 09:51:32 crc kubenswrapper[4563]: I1124 09:51:32.933542 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7cd4fb6f79-qhzw4_d5e12170-5cc0-4f8f-89d7-c64f38f2226e/kube-rbac-proxy/0.log" Nov 24 09:51:32 crc kubenswrapper[4563]: I1124 09:51:32.970710 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7cd4fb6f79-qhzw4_d5e12170-5cc0-4f8f-89d7-c64f38f2226e/manager/0.log" Nov 24 09:51:38 crc kubenswrapper[4563]: I1124 09:51:38.987683 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:51:38 crc kubenswrapper[4563]: I1124 09:51:38.988366 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:51:45 crc kubenswrapper[4563]: I1124 
09:51:45.632044 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-z4lb4_378c7d30-dd7c-4aa5-83cf-7caca587f283/control-plane-machine-set-operator/0.log" Nov 24 09:51:45 crc kubenswrapper[4563]: I1124 09:51:45.784018 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bxdlb_48157749-8872-4c5b-b119-efe27cfd887e/kube-rbac-proxy/0.log" Nov 24 09:51:45 crc kubenswrapper[4563]: I1124 09:51:45.806203 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bxdlb_48157749-8872-4c5b-b119-efe27cfd887e/machine-api-operator/0.log" Nov 24 09:51:55 crc kubenswrapper[4563]: I1124 09:51:55.174121 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-bpkp8_271399d1-6304-4dcd-a3df-6c543849329e/cert-manager-controller/0.log" Nov 24 09:51:55 crc kubenswrapper[4563]: I1124 09:51:55.340056 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-p6b8w_a32e3f9f-14d2-44fb-ba5a-9ede6e568643/cert-manager-webhook/0.log" Nov 24 09:51:55 crc kubenswrapper[4563]: I1124 09:51:55.357139 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-mfw5p_dbb0b52d-058f-46a3-8342-811bd3f5b495/cert-manager-cainjector/0.log" Nov 24 09:52:05 crc kubenswrapper[4563]: I1124 09:52:05.068331 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5874bd7bc5-tqw7b_89298ce0-9e0a-4351-96a9-4b69233c7ba0/nmstate-console-plugin/0.log" Nov 24 09:52:05 crc kubenswrapper[4563]: I1124 09:52:05.207299 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-jbdfx_71fe428e-199c-422c-8911-79d2a7d27ab1/nmstate-handler/0.log" Nov 24 09:52:05 crc kubenswrapper[4563]: I1124 
09:52:05.253856 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-xf2lb_d0a1ac8a-df66-4ac5-9aed-a2001c905f21/kube-rbac-proxy/0.log" Nov 24 09:52:05 crc kubenswrapper[4563]: I1124 09:52:05.286149 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-xf2lb_d0a1ac8a-df66-4ac5-9aed-a2001c905f21/nmstate-metrics/0.log" Nov 24 09:52:05 crc kubenswrapper[4563]: I1124 09:52:05.365810 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-557fdffb88-ppw2s_e4241626-5fdc-4620-9ffd-6bdc19046a33/nmstate-operator/0.log" Nov 24 09:52:05 crc kubenswrapper[4563]: I1124 09:52:05.453583 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6b89b748d8-qz8gz_564d8757-3a04-48f3-b3a2-109930f83a10/nmstate-webhook/0.log" Nov 24 09:52:08 crc kubenswrapper[4563]: I1124 09:52:08.987525 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:52:08 crc kubenswrapper[4563]: I1124 09:52:08.988093 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:52:08 crc kubenswrapper[4563]: I1124 09:52:08.988157 4563 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" Nov 24 09:52:08 crc kubenswrapper[4563]: I1124 09:52:08.988752 4563 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"61bede8c7960fe7299ad0cfd68d688b281ba1733220ddc88fefa904a4696a51c"} pod="openshift-machine-config-operator/machine-config-daemon-stlxr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 09:52:08 crc kubenswrapper[4563]: I1124 09:52:08.988814 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" containerID="cri-o://61bede8c7960fe7299ad0cfd68d688b281ba1733220ddc88fefa904a4696a51c" gracePeriod=600 Nov 24 09:52:10 crc kubenswrapper[4563]: I1124 09:52:10.029432 4563 generic.go:334] "Generic (PLEG): container finished" podID="3b2bfe55-8989-49b3-bb61-e28189447627" containerID="61bede8c7960fe7299ad0cfd68d688b281ba1733220ddc88fefa904a4696a51c" exitCode=0 Nov 24 09:52:10 crc kubenswrapper[4563]: I1124 09:52:10.029546 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" event={"ID":"3b2bfe55-8989-49b3-bb61-e28189447627","Type":"ContainerDied","Data":"61bede8c7960fe7299ad0cfd68d688b281ba1733220ddc88fefa904a4696a51c"} Nov 24 09:52:10 crc kubenswrapper[4563]: I1124 09:52:10.030191 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" event={"ID":"3b2bfe55-8989-49b3-bb61-e28189447627","Type":"ContainerStarted","Data":"b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108"} Nov 24 09:52:10 crc kubenswrapper[4563]: I1124 09:52:10.030235 4563 scope.go:117] "RemoveContainer" containerID="4200cb1ce270ffd495cadabfc73fa2c475afed29e7c972624b9f87e354f98610" Nov 24 09:52:17 crc kubenswrapper[4563]: I1124 09:52:17.342578 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-6c7b4b5f48-7rql8_e3ae4470-f488-4bc7-b9e0-a37903b5400a/kube-rbac-proxy/0.log" Nov 24 09:52:17 crc kubenswrapper[4563]: I1124 09:52:17.475537 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-7rql8_e3ae4470-f488-4bc7-b9e0-a37903b5400a/controller/0.log" Nov 24 09:52:17 crc kubenswrapper[4563]: I1124 09:52:17.510448 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/cp-frr-files/0.log" Nov 24 09:52:17 crc kubenswrapper[4563]: I1124 09:52:17.716888 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/cp-metrics/0.log" Nov 24 09:52:17 crc kubenswrapper[4563]: I1124 09:52:17.773033 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/cp-reloader/0.log" Nov 24 09:52:17 crc kubenswrapper[4563]: I1124 09:52:17.777943 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/cp-frr-files/0.log" Nov 24 09:52:17 crc kubenswrapper[4563]: I1124 09:52:17.784050 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/cp-reloader/0.log" Nov 24 09:52:17 crc kubenswrapper[4563]: I1124 09:52:17.889722 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/cp-frr-files/0.log" Nov 24 09:52:17 crc kubenswrapper[4563]: I1124 09:52:17.924651 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/cp-metrics/0.log" Nov 24 09:52:17 crc kubenswrapper[4563]: I1124 09:52:17.946105 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/cp-reloader/0.log" Nov 24 09:52:17 crc kubenswrapper[4563]: I1124 09:52:17.971496 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/cp-metrics/0.log" Nov 24 09:52:18 crc kubenswrapper[4563]: I1124 09:52:18.107555 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/cp-frr-files/0.log" Nov 24 09:52:18 crc kubenswrapper[4563]: I1124 09:52:18.112788 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/cp-metrics/0.log" Nov 24 09:52:18 crc kubenswrapper[4563]: I1124 09:52:18.118302 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/cp-reloader/0.log" Nov 24 09:52:18 crc kubenswrapper[4563]: I1124 09:52:18.170602 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/controller/0.log" Nov 24 09:52:18 crc kubenswrapper[4563]: I1124 09:52:18.279678 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/kube-rbac-proxy/0.log" Nov 24 09:52:18 crc kubenswrapper[4563]: I1124 09:52:18.287473 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/frr-metrics/0.log" Nov 24 09:52:18 crc kubenswrapper[4563]: I1124 09:52:18.321111 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/kube-rbac-proxy-frr/0.log" Nov 24 09:52:18 crc kubenswrapper[4563]: I1124 09:52:18.499677 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/reloader/0.log" Nov 24 09:52:18 crc kubenswrapper[4563]: I1124 09:52:18.525774 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-6998585d5-fchts_22afba2f-88ba-4b65-8f98-a024f676b896/frr-k8s-webhook-server/0.log" Nov 24 09:52:18 crc kubenswrapper[4563]: I1124 09:52:18.736609 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7cd8c86c8-w62w9_5784d9be-5a59-4204-829a-dc637bfb7d90/manager/0.log" Nov 24 09:52:18 crc kubenswrapper[4563]: I1124 09:52:18.852929 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-559746f898-fwz9n_29db6437-f6b7-4f7f-a855-33b7316b09f8/webhook-server/0.log" Nov 24 09:52:19 crc kubenswrapper[4563]: I1124 09:52:19.036322 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tddpp_3b0c98e8-df1b-485b-972d-2e2ff8103006/kube-rbac-proxy/0.log" Nov 24 09:52:19 crc kubenswrapper[4563]: I1124 09:52:19.424209 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/frr/0.log" Nov 24 09:52:19 crc kubenswrapper[4563]: I1124 09:52:19.514280 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tddpp_3b0c98e8-df1b-485b-972d-2e2ff8103006/speaker/0.log" Nov 24 09:52:29 crc kubenswrapper[4563]: I1124 09:52:29.564614 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc_e25e7e42-b065-4947-a1d0-3d641b371d06/util/0.log" Nov 24 09:52:29 crc kubenswrapper[4563]: I1124 09:52:29.738501 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc_e25e7e42-b065-4947-a1d0-3d641b371d06/util/0.log" 
Nov 24 09:52:29 crc kubenswrapper[4563]: I1124 09:52:29.753701 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc_e25e7e42-b065-4947-a1d0-3d641b371d06/pull/0.log" Nov 24 09:52:29 crc kubenswrapper[4563]: I1124 09:52:29.772985 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc_e25e7e42-b065-4947-a1d0-3d641b371d06/pull/0.log" Nov 24 09:52:29 crc kubenswrapper[4563]: I1124 09:52:29.927819 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc_e25e7e42-b065-4947-a1d0-3d641b371d06/util/0.log" Nov 24 09:52:29 crc kubenswrapper[4563]: I1124 09:52:29.928857 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc_e25e7e42-b065-4947-a1d0-3d641b371d06/pull/0.log" Nov 24 09:52:29 crc kubenswrapper[4563]: I1124 09:52:29.930830 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc_e25e7e42-b065-4947-a1d0-3d641b371d06/extract/0.log" Nov 24 09:52:30 crc kubenswrapper[4563]: I1124 09:52:30.068838 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sf9ml_83e0ce45-d845-49ab-b393-af085b920737/extract-utilities/0.log" Nov 24 09:52:30 crc kubenswrapper[4563]: I1124 09:52:30.198166 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sf9ml_83e0ce45-d845-49ab-b393-af085b920737/extract-utilities/0.log" Nov 24 09:52:30 crc kubenswrapper[4563]: I1124 09:52:30.250397 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-sf9ml_83e0ce45-d845-49ab-b393-af085b920737/extract-content/0.log" Nov 24 09:52:30 crc kubenswrapper[4563]: I1124 09:52:30.259198 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sf9ml_83e0ce45-d845-49ab-b393-af085b920737/extract-content/0.log" Nov 24 09:52:30 crc kubenswrapper[4563]: I1124 09:52:30.430501 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sf9ml_83e0ce45-d845-49ab-b393-af085b920737/extract-content/0.log" Nov 24 09:52:30 crc kubenswrapper[4563]: I1124 09:52:30.462483 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sf9ml_83e0ce45-d845-49ab-b393-af085b920737/extract-utilities/0.log" Nov 24 09:52:30 crc kubenswrapper[4563]: I1124 09:52:30.678616 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6cgfm_48f738ab-b1e0-4e09-baf3-3ed15d54151c/extract-utilities/0.log" Nov 24 09:52:30 crc kubenswrapper[4563]: I1124 09:52:30.818885 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sf9ml_83e0ce45-d845-49ab-b393-af085b920737/registry-server/0.log" Nov 24 09:52:30 crc kubenswrapper[4563]: I1124 09:52:30.869130 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6cgfm_48f738ab-b1e0-4e09-baf3-3ed15d54151c/extract-content/0.log" Nov 24 09:52:30 crc kubenswrapper[4563]: I1124 09:52:30.872690 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6cgfm_48f738ab-b1e0-4e09-baf3-3ed15d54151c/extract-content/0.log" Nov 24 09:52:30 crc kubenswrapper[4563]: I1124 09:52:30.895915 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-6cgfm_48f738ab-b1e0-4e09-baf3-3ed15d54151c/extract-utilities/0.log" Nov 24 09:52:31 crc kubenswrapper[4563]: I1124 09:52:31.066461 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6cgfm_48f738ab-b1e0-4e09-baf3-3ed15d54151c/extract-utilities/0.log" Nov 24 09:52:31 crc kubenswrapper[4563]: I1124 09:52:31.071971 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6cgfm_48f738ab-b1e0-4e09-baf3-3ed15d54151c/extract-content/0.log" Nov 24 09:52:31 crc kubenswrapper[4563]: I1124 09:52:31.245920 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc_2e79caaa-09f9-4720-b80d-300d880d7e26/util/0.log" Nov 24 09:52:31 crc kubenswrapper[4563]: I1124 09:52:31.453164 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc_2e79caaa-09f9-4720-b80d-300d880d7e26/pull/0.log" Nov 24 09:52:31 crc kubenswrapper[4563]: I1124 09:52:31.461399 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc_2e79caaa-09f9-4720-b80d-300d880d7e26/util/0.log" Nov 24 09:52:31 crc kubenswrapper[4563]: I1124 09:52:31.496323 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc_2e79caaa-09f9-4720-b80d-300d880d7e26/pull/0.log" Nov 24 09:52:31 crc kubenswrapper[4563]: I1124 09:52:31.508223 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6cgfm_48f738ab-b1e0-4e09-baf3-3ed15d54151c/registry-server/0.log" Nov 24 09:52:31 crc kubenswrapper[4563]: I1124 09:52:31.651179 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc_2e79caaa-09f9-4720-b80d-300d880d7e26/util/0.log" Nov 24 09:52:31 crc kubenswrapper[4563]: I1124 09:52:31.656294 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc_2e79caaa-09f9-4720-b80d-300d880d7e26/pull/0.log" Nov 24 09:52:31 crc kubenswrapper[4563]: I1124 09:52:31.669285 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc_2e79caaa-09f9-4720-b80d-300d880d7e26/extract/0.log" Nov 24 09:52:31 crc kubenswrapper[4563]: I1124 09:52:31.802515 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-5pxln_157ed1a3-ea31-4a6b-8e91-2852d4c50600/marketplace-operator/0.log" Nov 24 09:52:31 crc kubenswrapper[4563]: I1124 09:52:31.803251 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nsq9v_cb0770b7-f971-4e96-ab32-1f41b4cd9885/extract-utilities/0.log" Nov 24 09:52:31 crc kubenswrapper[4563]: I1124 09:52:31.976281 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nsq9v_cb0770b7-f971-4e96-ab32-1f41b4cd9885/extract-utilities/0.log" Nov 24 09:52:31 crc kubenswrapper[4563]: I1124 09:52:31.989294 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nsq9v_cb0770b7-f971-4e96-ab32-1f41b4cd9885/extract-content/0.log" Nov 24 09:52:31 crc kubenswrapper[4563]: I1124 09:52:31.993120 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nsq9v_cb0770b7-f971-4e96-ab32-1f41b4cd9885/extract-content/0.log" Nov 24 09:52:32 crc kubenswrapper[4563]: I1124 09:52:32.137528 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-nsq9v_cb0770b7-f971-4e96-ab32-1f41b4cd9885/extract-content/0.log" Nov 24 09:52:32 crc kubenswrapper[4563]: I1124 09:52:32.172559 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nsq9v_cb0770b7-f971-4e96-ab32-1f41b4cd9885/extract-utilities/0.log" Nov 24 09:52:32 crc kubenswrapper[4563]: I1124 09:52:32.231885 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nsq9v_cb0770b7-f971-4e96-ab32-1f41b4cd9885/registry-server/0.log" Nov 24 09:52:32 crc kubenswrapper[4563]: I1124 09:52:32.304981 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r9lpl_e26141fd-4cfa-4726-ba65-1f3bb830411b/extract-utilities/0.log" Nov 24 09:52:32 crc kubenswrapper[4563]: I1124 09:52:32.455782 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r9lpl_e26141fd-4cfa-4726-ba65-1f3bb830411b/extract-content/0.log" Nov 24 09:52:32 crc kubenswrapper[4563]: I1124 09:52:32.462340 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r9lpl_e26141fd-4cfa-4726-ba65-1f3bb830411b/extract-utilities/0.log" Nov 24 09:52:32 crc kubenswrapper[4563]: I1124 09:52:32.499870 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r9lpl_e26141fd-4cfa-4726-ba65-1f3bb830411b/extract-content/0.log" Nov 24 09:52:32 crc kubenswrapper[4563]: I1124 09:52:32.613182 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r9lpl_e26141fd-4cfa-4726-ba65-1f3bb830411b/extract-utilities/0.log" Nov 24 09:52:32 crc kubenswrapper[4563]: I1124 09:52:32.647301 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r9lpl_e26141fd-4cfa-4726-ba65-1f3bb830411b/extract-content/0.log" Nov 
24 09:52:32 crc kubenswrapper[4563]: I1124 09:52:32.905518 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r9lpl_e26141fd-4cfa-4726-ba65-1f3bb830411b/registry-server/0.log" Nov 24 09:53:08 crc kubenswrapper[4563]: I1124 09:53:08.465605 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p7vn5"] Nov 24 09:53:08 crc kubenswrapper[4563]: E1124 09:53:08.466657 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0711f0-5fe8-48d2-a50b-805157edd591" containerName="extract-content" Nov 24 09:53:08 crc kubenswrapper[4563]: I1124 09:53:08.466672 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0711f0-5fe8-48d2-a50b-805157edd591" containerName="extract-content" Nov 24 09:53:08 crc kubenswrapper[4563]: E1124 09:53:08.466697 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0711f0-5fe8-48d2-a50b-805157edd591" containerName="extract-utilities" Nov 24 09:53:08 crc kubenswrapper[4563]: I1124 09:53:08.466703 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0711f0-5fe8-48d2-a50b-805157edd591" containerName="extract-utilities" Nov 24 09:53:08 crc kubenswrapper[4563]: E1124 09:53:08.466715 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0711f0-5fe8-48d2-a50b-805157edd591" containerName="registry-server" Nov 24 09:53:08 crc kubenswrapper[4563]: I1124 09:53:08.466721 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0711f0-5fe8-48d2-a50b-805157edd591" containerName="registry-server" Nov 24 09:53:08 crc kubenswrapper[4563]: I1124 09:53:08.466954 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee0711f0-5fe8-48d2-a50b-805157edd591" containerName="registry-server" Nov 24 09:53:08 crc kubenswrapper[4563]: I1124 09:53:08.468348 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p7vn5" Nov 24 09:53:08 crc kubenswrapper[4563]: I1124 09:53:08.474294 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p7vn5"] Nov 24 09:53:08 crc kubenswrapper[4563]: I1124 09:53:08.609440 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb5645ca-0429-44c5-85dd-9aa1cca5aba2-utilities\") pod \"certified-operators-p7vn5\" (UID: \"fb5645ca-0429-44c5-85dd-9aa1cca5aba2\") " pod="openshift-marketplace/certified-operators-p7vn5" Nov 24 09:53:08 crc kubenswrapper[4563]: I1124 09:53:08.609499 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb5645ca-0429-44c5-85dd-9aa1cca5aba2-catalog-content\") pod \"certified-operators-p7vn5\" (UID: \"fb5645ca-0429-44c5-85dd-9aa1cca5aba2\") " pod="openshift-marketplace/certified-operators-p7vn5" Nov 24 09:53:08 crc kubenswrapper[4563]: I1124 09:53:08.609833 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5q7v\" (UniqueName: \"kubernetes.io/projected/fb5645ca-0429-44c5-85dd-9aa1cca5aba2-kube-api-access-p5q7v\") pod \"certified-operators-p7vn5\" (UID: \"fb5645ca-0429-44c5-85dd-9aa1cca5aba2\") " pod="openshift-marketplace/certified-operators-p7vn5" Nov 24 09:53:08 crc kubenswrapper[4563]: I1124 09:53:08.712210 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5q7v\" (UniqueName: \"kubernetes.io/projected/fb5645ca-0429-44c5-85dd-9aa1cca5aba2-kube-api-access-p5q7v\") pod \"certified-operators-p7vn5\" (UID: \"fb5645ca-0429-44c5-85dd-9aa1cca5aba2\") " pod="openshift-marketplace/certified-operators-p7vn5" Nov 24 09:53:08 crc kubenswrapper[4563]: I1124 09:53:08.712333 4563 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb5645ca-0429-44c5-85dd-9aa1cca5aba2-utilities\") pod \"certified-operators-p7vn5\" (UID: \"fb5645ca-0429-44c5-85dd-9aa1cca5aba2\") " pod="openshift-marketplace/certified-operators-p7vn5" Nov 24 09:53:08 crc kubenswrapper[4563]: I1124 09:53:08.712366 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb5645ca-0429-44c5-85dd-9aa1cca5aba2-catalog-content\") pod \"certified-operators-p7vn5\" (UID: \"fb5645ca-0429-44c5-85dd-9aa1cca5aba2\") " pod="openshift-marketplace/certified-operators-p7vn5" Nov 24 09:53:08 crc kubenswrapper[4563]: I1124 09:53:08.712944 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb5645ca-0429-44c5-85dd-9aa1cca5aba2-utilities\") pod \"certified-operators-p7vn5\" (UID: \"fb5645ca-0429-44c5-85dd-9aa1cca5aba2\") " pod="openshift-marketplace/certified-operators-p7vn5" Nov 24 09:53:08 crc kubenswrapper[4563]: I1124 09:53:08.712972 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb5645ca-0429-44c5-85dd-9aa1cca5aba2-catalog-content\") pod \"certified-operators-p7vn5\" (UID: \"fb5645ca-0429-44c5-85dd-9aa1cca5aba2\") " pod="openshift-marketplace/certified-operators-p7vn5" Nov 24 09:53:08 crc kubenswrapper[4563]: I1124 09:53:08.729353 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5q7v\" (UniqueName: \"kubernetes.io/projected/fb5645ca-0429-44c5-85dd-9aa1cca5aba2-kube-api-access-p5q7v\") pod \"certified-operators-p7vn5\" (UID: \"fb5645ca-0429-44c5-85dd-9aa1cca5aba2\") " pod="openshift-marketplace/certified-operators-p7vn5" Nov 24 09:53:08 crc kubenswrapper[4563]: I1124 09:53:08.788857 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p7vn5" Nov 24 09:53:09 crc kubenswrapper[4563]: I1124 09:53:09.233461 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p7vn5"] Nov 24 09:53:09 crc kubenswrapper[4563]: I1124 09:53:09.547176 4563 generic.go:334] "Generic (PLEG): container finished" podID="fb5645ca-0429-44c5-85dd-9aa1cca5aba2" containerID="70933930c2bda10f21115cd044aa2d67d861efa612772e652443a5322329ff31" exitCode=0 Nov 24 09:53:09 crc kubenswrapper[4563]: I1124 09:53:09.547339 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7vn5" event={"ID":"fb5645ca-0429-44c5-85dd-9aa1cca5aba2","Type":"ContainerDied","Data":"70933930c2bda10f21115cd044aa2d67d861efa612772e652443a5322329ff31"} Nov 24 09:53:09 crc kubenswrapper[4563]: I1124 09:53:09.547429 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7vn5" event={"ID":"fb5645ca-0429-44c5-85dd-9aa1cca5aba2","Type":"ContainerStarted","Data":"7ece76547e7bf38e5b9e96e285508001eeb47c8ab6427ab556e51cef8df50ee8"} Nov 24 09:53:10 crc kubenswrapper[4563]: I1124 09:53:10.557772 4563 generic.go:334] "Generic (PLEG): container finished" podID="fb5645ca-0429-44c5-85dd-9aa1cca5aba2" containerID="00a26e29f76dbabf0d0f8a0a46d8690d97b29c425f1d6a989580c1446d36aa90" exitCode=0 Nov 24 09:53:10 crc kubenswrapper[4563]: I1124 09:53:10.557989 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7vn5" event={"ID":"fb5645ca-0429-44c5-85dd-9aa1cca5aba2","Type":"ContainerDied","Data":"00a26e29f76dbabf0d0f8a0a46d8690d97b29c425f1d6a989580c1446d36aa90"} Nov 24 09:53:11 crc kubenswrapper[4563]: I1124 09:53:11.568429 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7vn5" 
event={"ID":"fb5645ca-0429-44c5-85dd-9aa1cca5aba2","Type":"ContainerStarted","Data":"625f18238c10176adea1c17579e52dac139cde9d4f01c69e6ee08a37c8665676"} Nov 24 09:53:11 crc kubenswrapper[4563]: I1124 09:53:11.591899 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p7vn5" podStartSLOduration=1.903611231 podStartE2EDuration="3.591818789s" podCreationTimestamp="2025-11-24 09:53:08 +0000 UTC" firstStartedPulling="2025-11-24 09:53:09.548671889 +0000 UTC m=+2966.807649326" lastFinishedPulling="2025-11-24 09:53:11.236879437 +0000 UTC m=+2968.495856884" observedRunningTime="2025-11-24 09:53:11.587267599 +0000 UTC m=+2968.846245046" watchObservedRunningTime="2025-11-24 09:53:11.591818789 +0000 UTC m=+2968.850796237" Nov 24 09:53:18 crc kubenswrapper[4563]: I1124 09:53:18.790033 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p7vn5" Nov 24 09:53:18 crc kubenswrapper[4563]: I1124 09:53:18.790548 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p7vn5" Nov 24 09:53:18 crc kubenswrapper[4563]: I1124 09:53:18.827542 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p7vn5" Nov 24 09:53:19 crc kubenswrapper[4563]: I1124 09:53:19.665045 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p7vn5" Nov 24 09:53:19 crc kubenswrapper[4563]: I1124 09:53:19.717696 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p7vn5"] Nov 24 09:53:21 crc kubenswrapper[4563]: I1124 09:53:21.642913 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p7vn5" podUID="fb5645ca-0429-44c5-85dd-9aa1cca5aba2" containerName="registry-server" 
containerID="cri-o://625f18238c10176adea1c17579e52dac139cde9d4f01c69e6ee08a37c8665676" gracePeriod=2 Nov 24 09:53:21 crc kubenswrapper[4563]: E1124 09:53:21.731280 4563 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb5645ca_0429_44c5_85dd_9aa1cca5aba2.slice/crio-625f18238c10176adea1c17579e52dac139cde9d4f01c69e6ee08a37c8665676.scope\": RecentStats: unable to find data in memory cache]" Nov 24 09:53:22 crc kubenswrapper[4563]: I1124 09:53:22.016976 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p7vn5" Nov 24 09:53:22 crc kubenswrapper[4563]: I1124 09:53:22.191583 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb5645ca-0429-44c5-85dd-9aa1cca5aba2-catalog-content\") pod \"fb5645ca-0429-44c5-85dd-9aa1cca5aba2\" (UID: \"fb5645ca-0429-44c5-85dd-9aa1cca5aba2\") " Nov 24 09:53:22 crc kubenswrapper[4563]: I1124 09:53:22.191748 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5q7v\" (UniqueName: \"kubernetes.io/projected/fb5645ca-0429-44c5-85dd-9aa1cca5aba2-kube-api-access-p5q7v\") pod \"fb5645ca-0429-44c5-85dd-9aa1cca5aba2\" (UID: \"fb5645ca-0429-44c5-85dd-9aa1cca5aba2\") " Nov 24 09:53:22 crc kubenswrapper[4563]: I1124 09:53:22.191854 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb5645ca-0429-44c5-85dd-9aa1cca5aba2-utilities\") pod \"fb5645ca-0429-44c5-85dd-9aa1cca5aba2\" (UID: \"fb5645ca-0429-44c5-85dd-9aa1cca5aba2\") " Nov 24 09:53:22 crc kubenswrapper[4563]: I1124 09:53:22.193282 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb5645ca-0429-44c5-85dd-9aa1cca5aba2-utilities" 
(OuterVolumeSpecName: "utilities") pod "fb5645ca-0429-44c5-85dd-9aa1cca5aba2" (UID: "fb5645ca-0429-44c5-85dd-9aa1cca5aba2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:53:22 crc kubenswrapper[4563]: I1124 09:53:22.199720 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb5645ca-0429-44c5-85dd-9aa1cca5aba2-kube-api-access-p5q7v" (OuterVolumeSpecName: "kube-api-access-p5q7v") pod "fb5645ca-0429-44c5-85dd-9aa1cca5aba2" (UID: "fb5645ca-0429-44c5-85dd-9aa1cca5aba2"). InnerVolumeSpecName "kube-api-access-p5q7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:53:22 crc kubenswrapper[4563]: I1124 09:53:22.228081 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb5645ca-0429-44c5-85dd-9aa1cca5aba2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb5645ca-0429-44c5-85dd-9aa1cca5aba2" (UID: "fb5645ca-0429-44c5-85dd-9aa1cca5aba2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:53:22 crc kubenswrapper[4563]: I1124 09:53:22.294750 4563 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb5645ca-0429-44c5-85dd-9aa1cca5aba2-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 09:53:22 crc kubenswrapper[4563]: I1124 09:53:22.294782 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5q7v\" (UniqueName: \"kubernetes.io/projected/fb5645ca-0429-44c5-85dd-9aa1cca5aba2-kube-api-access-p5q7v\") on node \"crc\" DevicePath \"\"" Nov 24 09:53:22 crc kubenswrapper[4563]: I1124 09:53:22.294796 4563 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb5645ca-0429-44c5-85dd-9aa1cca5aba2-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 09:53:22 crc kubenswrapper[4563]: I1124 09:53:22.652332 4563 generic.go:334] "Generic (PLEG): container finished" podID="fb5645ca-0429-44c5-85dd-9aa1cca5aba2" containerID="625f18238c10176adea1c17579e52dac139cde9d4f01c69e6ee08a37c8665676" exitCode=0 Nov 24 09:53:22 crc kubenswrapper[4563]: I1124 09:53:22.652374 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7vn5" event={"ID":"fb5645ca-0429-44c5-85dd-9aa1cca5aba2","Type":"ContainerDied","Data":"625f18238c10176adea1c17579e52dac139cde9d4f01c69e6ee08a37c8665676"} Nov 24 09:53:22 crc kubenswrapper[4563]: I1124 09:53:22.652403 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p7vn5" event={"ID":"fb5645ca-0429-44c5-85dd-9aa1cca5aba2","Type":"ContainerDied","Data":"7ece76547e7bf38e5b9e96e285508001eeb47c8ab6427ab556e51cef8df50ee8"} Nov 24 09:53:22 crc kubenswrapper[4563]: I1124 09:53:22.652419 4563 scope.go:117] "RemoveContainer" containerID="625f18238c10176adea1c17579e52dac139cde9d4f01c69e6ee08a37c8665676" Nov 24 09:53:22 crc kubenswrapper[4563]: I1124 
09:53:22.652544 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p7vn5" Nov 24 09:53:22 crc kubenswrapper[4563]: I1124 09:53:22.681931 4563 scope.go:117] "RemoveContainer" containerID="00a26e29f76dbabf0d0f8a0a46d8690d97b29c425f1d6a989580c1446d36aa90" Nov 24 09:53:22 crc kubenswrapper[4563]: I1124 09:53:22.687136 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p7vn5"] Nov 24 09:53:22 crc kubenswrapper[4563]: I1124 09:53:22.697968 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p7vn5"] Nov 24 09:53:22 crc kubenswrapper[4563]: I1124 09:53:22.701944 4563 scope.go:117] "RemoveContainer" containerID="70933930c2bda10f21115cd044aa2d67d861efa612772e652443a5322329ff31" Nov 24 09:53:22 crc kubenswrapper[4563]: I1124 09:53:22.729172 4563 scope.go:117] "RemoveContainer" containerID="625f18238c10176adea1c17579e52dac139cde9d4f01c69e6ee08a37c8665676" Nov 24 09:53:22 crc kubenswrapper[4563]: E1124 09:53:22.729493 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"625f18238c10176adea1c17579e52dac139cde9d4f01c69e6ee08a37c8665676\": container with ID starting with 625f18238c10176adea1c17579e52dac139cde9d4f01c69e6ee08a37c8665676 not found: ID does not exist" containerID="625f18238c10176adea1c17579e52dac139cde9d4f01c69e6ee08a37c8665676" Nov 24 09:53:22 crc kubenswrapper[4563]: I1124 09:53:22.729519 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"625f18238c10176adea1c17579e52dac139cde9d4f01c69e6ee08a37c8665676"} err="failed to get container status \"625f18238c10176adea1c17579e52dac139cde9d4f01c69e6ee08a37c8665676\": rpc error: code = NotFound desc = could not find container \"625f18238c10176adea1c17579e52dac139cde9d4f01c69e6ee08a37c8665676\": container with ID starting with 
625f18238c10176adea1c17579e52dac139cde9d4f01c69e6ee08a37c8665676 not found: ID does not exist" Nov 24 09:53:22 crc kubenswrapper[4563]: I1124 09:53:22.729537 4563 scope.go:117] "RemoveContainer" containerID="00a26e29f76dbabf0d0f8a0a46d8690d97b29c425f1d6a989580c1446d36aa90" Nov 24 09:53:22 crc kubenswrapper[4563]: E1124 09:53:22.729768 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00a26e29f76dbabf0d0f8a0a46d8690d97b29c425f1d6a989580c1446d36aa90\": container with ID starting with 00a26e29f76dbabf0d0f8a0a46d8690d97b29c425f1d6a989580c1446d36aa90 not found: ID does not exist" containerID="00a26e29f76dbabf0d0f8a0a46d8690d97b29c425f1d6a989580c1446d36aa90" Nov 24 09:53:22 crc kubenswrapper[4563]: I1124 09:53:22.729859 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00a26e29f76dbabf0d0f8a0a46d8690d97b29c425f1d6a989580c1446d36aa90"} err="failed to get container status \"00a26e29f76dbabf0d0f8a0a46d8690d97b29c425f1d6a989580c1446d36aa90\": rpc error: code = NotFound desc = could not find container \"00a26e29f76dbabf0d0f8a0a46d8690d97b29c425f1d6a989580c1446d36aa90\": container with ID starting with 00a26e29f76dbabf0d0f8a0a46d8690d97b29c425f1d6a989580c1446d36aa90 not found: ID does not exist" Nov 24 09:53:22 crc kubenswrapper[4563]: I1124 09:53:22.729944 4563 scope.go:117] "RemoveContainer" containerID="70933930c2bda10f21115cd044aa2d67d861efa612772e652443a5322329ff31" Nov 24 09:53:22 crc kubenswrapper[4563]: E1124 09:53:22.730230 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70933930c2bda10f21115cd044aa2d67d861efa612772e652443a5322329ff31\": container with ID starting with 70933930c2bda10f21115cd044aa2d67d861efa612772e652443a5322329ff31 not found: ID does not exist" containerID="70933930c2bda10f21115cd044aa2d67d861efa612772e652443a5322329ff31" Nov 24 09:53:22 crc 
kubenswrapper[4563]: I1124 09:53:22.730271 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70933930c2bda10f21115cd044aa2d67d861efa612772e652443a5322329ff31"} err="failed to get container status \"70933930c2bda10f21115cd044aa2d67d861efa612772e652443a5322329ff31\": rpc error: code = NotFound desc = could not find container \"70933930c2bda10f21115cd044aa2d67d861efa612772e652443a5322329ff31\": container with ID starting with 70933930c2bda10f21115cd044aa2d67d861efa612772e652443a5322329ff31 not found: ID does not exist" Nov 24 09:53:23 crc kubenswrapper[4563]: I1124 09:53:23.064548 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb5645ca-0429-44c5-85dd-9aa1cca5aba2" path="/var/lib/kubelet/pods/fb5645ca-0429-44c5-85dd-9aa1cca5aba2/volumes" Nov 24 09:53:50 crc kubenswrapper[4563]: I1124 09:53:50.830424 4563 generic.go:334] "Generic (PLEG): container finished" podID="ff74dd16-80eb-41bb-93bb-91e9f6f96748" containerID="fe5b7a77e60574de53067cedb6181b0255c0f8f4da729ebff79bd2ac207990d8" exitCode=0 Nov 24 09:53:50 crc kubenswrapper[4563]: I1124 09:53:50.830502 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vndt9/must-gather-gxrkl" event={"ID":"ff74dd16-80eb-41bb-93bb-91e9f6f96748","Type":"ContainerDied","Data":"fe5b7a77e60574de53067cedb6181b0255c0f8f4da729ebff79bd2ac207990d8"} Nov 24 09:53:50 crc kubenswrapper[4563]: I1124 09:53:50.831392 4563 scope.go:117] "RemoveContainer" containerID="fe5b7a77e60574de53067cedb6181b0255c0f8f4da729ebff79bd2ac207990d8" Nov 24 09:53:51 crc kubenswrapper[4563]: I1124 09:53:51.207813 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vndt9_must-gather-gxrkl_ff74dd16-80eb-41bb-93bb-91e9f6f96748/gather/0.log" Nov 24 09:53:53 crc kubenswrapper[4563]: I1124 09:53:53.223334 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sq8r2"] Nov 24 09:53:53 crc 
kubenswrapper[4563]: E1124 09:53:53.224437 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb5645ca-0429-44c5-85dd-9aa1cca5aba2" containerName="registry-server" Nov 24 09:53:53 crc kubenswrapper[4563]: I1124 09:53:53.224450 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb5645ca-0429-44c5-85dd-9aa1cca5aba2" containerName="registry-server" Nov 24 09:53:53 crc kubenswrapper[4563]: E1124 09:53:53.224478 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb5645ca-0429-44c5-85dd-9aa1cca5aba2" containerName="extract-utilities" Nov 24 09:53:53 crc kubenswrapper[4563]: I1124 09:53:53.224484 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb5645ca-0429-44c5-85dd-9aa1cca5aba2" containerName="extract-utilities" Nov 24 09:53:53 crc kubenswrapper[4563]: E1124 09:53:53.224505 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb5645ca-0429-44c5-85dd-9aa1cca5aba2" containerName="extract-content" Nov 24 09:53:53 crc kubenswrapper[4563]: I1124 09:53:53.224510 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb5645ca-0429-44c5-85dd-9aa1cca5aba2" containerName="extract-content" Nov 24 09:53:53 crc kubenswrapper[4563]: I1124 09:53:53.224712 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb5645ca-0429-44c5-85dd-9aa1cca5aba2" containerName="registry-server" Nov 24 09:53:53 crc kubenswrapper[4563]: I1124 09:53:53.226046 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sq8r2" Nov 24 09:53:53 crc kubenswrapper[4563]: I1124 09:53:53.232164 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sq8r2"] Nov 24 09:53:53 crc kubenswrapper[4563]: I1124 09:53:53.301426 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q87m6\" (UniqueName: \"kubernetes.io/projected/429d655b-5312-4c9d-aded-66a40d9c8562-kube-api-access-q87m6\") pod \"community-operators-sq8r2\" (UID: \"429d655b-5312-4c9d-aded-66a40d9c8562\") " pod="openshift-marketplace/community-operators-sq8r2" Nov 24 09:53:53 crc kubenswrapper[4563]: I1124 09:53:53.301698 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/429d655b-5312-4c9d-aded-66a40d9c8562-utilities\") pod \"community-operators-sq8r2\" (UID: \"429d655b-5312-4c9d-aded-66a40d9c8562\") " pod="openshift-marketplace/community-operators-sq8r2" Nov 24 09:53:53 crc kubenswrapper[4563]: I1124 09:53:53.301856 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/429d655b-5312-4c9d-aded-66a40d9c8562-catalog-content\") pod \"community-operators-sq8r2\" (UID: \"429d655b-5312-4c9d-aded-66a40d9c8562\") " pod="openshift-marketplace/community-operators-sq8r2" Nov 24 09:53:53 crc kubenswrapper[4563]: I1124 09:53:53.403792 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/429d655b-5312-4c9d-aded-66a40d9c8562-utilities\") pod \"community-operators-sq8r2\" (UID: \"429d655b-5312-4c9d-aded-66a40d9c8562\") " pod="openshift-marketplace/community-operators-sq8r2" Nov 24 09:53:53 crc kubenswrapper[4563]: I1124 09:53:53.404124 4563 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/429d655b-5312-4c9d-aded-66a40d9c8562-catalog-content\") pod \"community-operators-sq8r2\" (UID: \"429d655b-5312-4c9d-aded-66a40d9c8562\") " pod="openshift-marketplace/community-operators-sq8r2" Nov 24 09:53:53 crc kubenswrapper[4563]: I1124 09:53:53.404219 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/429d655b-5312-4c9d-aded-66a40d9c8562-utilities\") pod \"community-operators-sq8r2\" (UID: \"429d655b-5312-4c9d-aded-66a40d9c8562\") " pod="openshift-marketplace/community-operators-sq8r2" Nov 24 09:53:53 crc kubenswrapper[4563]: I1124 09:53:53.404435 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q87m6\" (UniqueName: \"kubernetes.io/projected/429d655b-5312-4c9d-aded-66a40d9c8562-kube-api-access-q87m6\") pod \"community-operators-sq8r2\" (UID: \"429d655b-5312-4c9d-aded-66a40d9c8562\") " pod="openshift-marketplace/community-operators-sq8r2" Nov 24 09:53:53 crc kubenswrapper[4563]: I1124 09:53:53.404442 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/429d655b-5312-4c9d-aded-66a40d9c8562-catalog-content\") pod \"community-operators-sq8r2\" (UID: \"429d655b-5312-4c9d-aded-66a40d9c8562\") " pod="openshift-marketplace/community-operators-sq8r2" Nov 24 09:53:53 crc kubenswrapper[4563]: I1124 09:53:53.421012 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q87m6\" (UniqueName: \"kubernetes.io/projected/429d655b-5312-4c9d-aded-66a40d9c8562-kube-api-access-q87m6\") pod \"community-operators-sq8r2\" (UID: \"429d655b-5312-4c9d-aded-66a40d9c8562\") " pod="openshift-marketplace/community-operators-sq8r2" Nov 24 09:53:53 crc kubenswrapper[4563]: I1124 09:53:53.551575 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sq8r2" Nov 24 09:53:53 crc kubenswrapper[4563]: I1124 09:53:53.940191 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sq8r2"] Nov 24 09:53:54 crc kubenswrapper[4563]: I1124 09:53:54.862474 4563 generic.go:334] "Generic (PLEG): container finished" podID="429d655b-5312-4c9d-aded-66a40d9c8562" containerID="3d8e8f03c2c37ea1dcd4de5979f6463d9f716c2223744812e5cf414c34eab941" exitCode=0 Nov 24 09:53:54 crc kubenswrapper[4563]: I1124 09:53:54.862586 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sq8r2" event={"ID":"429d655b-5312-4c9d-aded-66a40d9c8562","Type":"ContainerDied","Data":"3d8e8f03c2c37ea1dcd4de5979f6463d9f716c2223744812e5cf414c34eab941"} Nov 24 09:53:54 crc kubenswrapper[4563]: I1124 09:53:54.862748 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sq8r2" event={"ID":"429d655b-5312-4c9d-aded-66a40d9c8562","Type":"ContainerStarted","Data":"154dd06a795c501e1a3a3fbb5b69beb370994ec594f782f05b873cda210c3614"} Nov 24 09:53:55 crc kubenswrapper[4563]: I1124 09:53:55.871136 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sq8r2" event={"ID":"429d655b-5312-4c9d-aded-66a40d9c8562","Type":"ContainerStarted","Data":"e6187797c08c07edd99d2be4573dd8f2ef131f60abac98ecc9c06c1c36db41d3"} Nov 24 09:53:56 crc kubenswrapper[4563]: I1124 09:53:56.884037 4563 generic.go:334] "Generic (PLEG): container finished" podID="429d655b-5312-4c9d-aded-66a40d9c8562" containerID="e6187797c08c07edd99d2be4573dd8f2ef131f60abac98ecc9c06c1c36db41d3" exitCode=0 Nov 24 09:53:56 crc kubenswrapper[4563]: I1124 09:53:56.884085 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sq8r2" 
event={"ID":"429d655b-5312-4c9d-aded-66a40d9c8562","Type":"ContainerDied","Data":"e6187797c08c07edd99d2be4573dd8f2ef131f60abac98ecc9c06c1c36db41d3"} Nov 24 09:53:57 crc kubenswrapper[4563]: I1124 09:53:57.895661 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sq8r2" event={"ID":"429d655b-5312-4c9d-aded-66a40d9c8562","Type":"ContainerStarted","Data":"9d1ecb0f5ce3cc9eac9d069806f331c95dd6e7173f0681fc9d552ca8e6d03e59"} Nov 24 09:53:57 crc kubenswrapper[4563]: I1124 09:53:57.914881 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sq8r2" podStartSLOduration=2.394813537 podStartE2EDuration="4.914861552s" podCreationTimestamp="2025-11-24 09:53:53 +0000 UTC" firstStartedPulling="2025-11-24 09:53:54.863903514 +0000 UTC m=+3012.122880961" lastFinishedPulling="2025-11-24 09:53:57.38395153 +0000 UTC m=+3014.642928976" observedRunningTime="2025-11-24 09:53:57.908358992 +0000 UTC m=+3015.167336439" watchObservedRunningTime="2025-11-24 09:53:57.914861552 +0000 UTC m=+3015.173838999" Nov 24 09:53:58 crc kubenswrapper[4563]: I1124 09:53:58.767043 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vndt9/must-gather-gxrkl"] Nov 24 09:53:58 crc kubenswrapper[4563]: I1124 09:53:58.767316 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-vndt9/must-gather-gxrkl" podUID="ff74dd16-80eb-41bb-93bb-91e9f6f96748" containerName="copy" containerID="cri-o://699c09ebab645394b3d5f865e79a18c61b7cecd095639fad1d525aeff5d4a0e5" gracePeriod=2 Nov 24 09:53:58 crc kubenswrapper[4563]: I1124 09:53:58.773529 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vndt9/must-gather-gxrkl"] Nov 24 09:53:58 crc kubenswrapper[4563]: I1124 09:53:58.926531 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-vndt9_must-gather-gxrkl_ff74dd16-80eb-41bb-93bb-91e9f6f96748/copy/0.log" Nov 24 09:53:58 crc kubenswrapper[4563]: I1124 09:53:58.927982 4563 generic.go:334] "Generic (PLEG): container finished" podID="ff74dd16-80eb-41bb-93bb-91e9f6f96748" containerID="699c09ebab645394b3d5f865e79a18c61b7cecd095639fad1d525aeff5d4a0e5" exitCode=143 Nov 24 09:53:59 crc kubenswrapper[4563]: I1124 09:53:59.122174 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vndt9_must-gather-gxrkl_ff74dd16-80eb-41bb-93bb-91e9f6f96748/copy/0.log" Nov 24 09:53:59 crc kubenswrapper[4563]: I1124 09:53:59.122475 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vndt9/must-gather-gxrkl" Nov 24 09:53:59 crc kubenswrapper[4563]: I1124 09:53:59.311223 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8r7z\" (UniqueName: \"kubernetes.io/projected/ff74dd16-80eb-41bb-93bb-91e9f6f96748-kube-api-access-q8r7z\") pod \"ff74dd16-80eb-41bb-93bb-91e9f6f96748\" (UID: \"ff74dd16-80eb-41bb-93bb-91e9f6f96748\") " Nov 24 09:53:59 crc kubenswrapper[4563]: I1124 09:53:59.311561 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ff74dd16-80eb-41bb-93bb-91e9f6f96748-must-gather-output\") pod \"ff74dd16-80eb-41bb-93bb-91e9f6f96748\" (UID: \"ff74dd16-80eb-41bb-93bb-91e9f6f96748\") " Nov 24 09:53:59 crc kubenswrapper[4563]: I1124 09:53:59.317320 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff74dd16-80eb-41bb-93bb-91e9f6f96748-kube-api-access-q8r7z" (OuterVolumeSpecName: "kube-api-access-q8r7z") pod "ff74dd16-80eb-41bb-93bb-91e9f6f96748" (UID: "ff74dd16-80eb-41bb-93bb-91e9f6f96748"). InnerVolumeSpecName "kube-api-access-q8r7z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:53:59 crc kubenswrapper[4563]: I1124 09:53:59.407778 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff74dd16-80eb-41bb-93bb-91e9f6f96748-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ff74dd16-80eb-41bb-93bb-91e9f6f96748" (UID: "ff74dd16-80eb-41bb-93bb-91e9f6f96748"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:53:59 crc kubenswrapper[4563]: I1124 09:53:59.414612 4563 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ff74dd16-80eb-41bb-93bb-91e9f6f96748-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 24 09:53:59 crc kubenswrapper[4563]: I1124 09:53:59.414658 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8r7z\" (UniqueName: \"kubernetes.io/projected/ff74dd16-80eb-41bb-93bb-91e9f6f96748-kube-api-access-q8r7z\") on node \"crc\" DevicePath \"\"" Nov 24 09:53:59 crc kubenswrapper[4563]: I1124 09:53:59.935822 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vndt9_must-gather-gxrkl_ff74dd16-80eb-41bb-93bb-91e9f6f96748/copy/0.log" Nov 24 09:53:59 crc kubenswrapper[4563]: I1124 09:53:59.936177 4563 scope.go:117] "RemoveContainer" containerID="699c09ebab645394b3d5f865e79a18c61b7cecd095639fad1d525aeff5d4a0e5" Nov 24 09:53:59 crc kubenswrapper[4563]: I1124 09:53:59.936206 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vndt9/must-gather-gxrkl" Nov 24 09:53:59 crc kubenswrapper[4563]: I1124 09:53:59.951197 4563 scope.go:117] "RemoveContainer" containerID="fe5b7a77e60574de53067cedb6181b0255c0f8f4da729ebff79bd2ac207990d8" Nov 24 09:54:01 crc kubenswrapper[4563]: I1124 09:54:01.063392 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff74dd16-80eb-41bb-93bb-91e9f6f96748" path="/var/lib/kubelet/pods/ff74dd16-80eb-41bb-93bb-91e9f6f96748/volumes" Nov 24 09:54:03 crc kubenswrapper[4563]: I1124 09:54:03.552354 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sq8r2" Nov 24 09:54:03 crc kubenswrapper[4563]: I1124 09:54:03.552845 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sq8r2" Nov 24 09:54:03 crc kubenswrapper[4563]: I1124 09:54:03.655825 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sq8r2" Nov 24 09:54:04 crc kubenswrapper[4563]: I1124 09:54:04.016867 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sq8r2" Nov 24 09:54:04 crc kubenswrapper[4563]: I1124 09:54:04.054140 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sq8r2"] Nov 24 09:54:05 crc kubenswrapper[4563]: I1124 09:54:05.998786 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sq8r2" podUID="429d655b-5312-4c9d-aded-66a40d9c8562" containerName="registry-server" containerID="cri-o://9d1ecb0f5ce3cc9eac9d069806f331c95dd6e7173f0681fc9d552ca8e6d03e59" gracePeriod=2 Nov 24 09:54:06 crc kubenswrapper[4563]: I1124 09:54:06.402618 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sq8r2" Nov 24 09:54:06 crc kubenswrapper[4563]: I1124 09:54:06.563331 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/429d655b-5312-4c9d-aded-66a40d9c8562-utilities\") pod \"429d655b-5312-4c9d-aded-66a40d9c8562\" (UID: \"429d655b-5312-4c9d-aded-66a40d9c8562\") " Nov 24 09:54:06 crc kubenswrapper[4563]: I1124 09:54:06.563592 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q87m6\" (UniqueName: \"kubernetes.io/projected/429d655b-5312-4c9d-aded-66a40d9c8562-kube-api-access-q87m6\") pod \"429d655b-5312-4c9d-aded-66a40d9c8562\" (UID: \"429d655b-5312-4c9d-aded-66a40d9c8562\") " Nov 24 09:54:06 crc kubenswrapper[4563]: I1124 09:54:06.563759 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/429d655b-5312-4c9d-aded-66a40d9c8562-catalog-content\") pod \"429d655b-5312-4c9d-aded-66a40d9c8562\" (UID: \"429d655b-5312-4c9d-aded-66a40d9c8562\") " Nov 24 09:54:06 crc kubenswrapper[4563]: I1124 09:54:06.564113 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/429d655b-5312-4c9d-aded-66a40d9c8562-utilities" (OuterVolumeSpecName: "utilities") pod "429d655b-5312-4c9d-aded-66a40d9c8562" (UID: "429d655b-5312-4c9d-aded-66a40d9c8562"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:54:06 crc kubenswrapper[4563]: I1124 09:54:06.564826 4563 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/429d655b-5312-4c9d-aded-66a40d9c8562-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 09:54:06 crc kubenswrapper[4563]: I1124 09:54:06.569827 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/429d655b-5312-4c9d-aded-66a40d9c8562-kube-api-access-q87m6" (OuterVolumeSpecName: "kube-api-access-q87m6") pod "429d655b-5312-4c9d-aded-66a40d9c8562" (UID: "429d655b-5312-4c9d-aded-66a40d9c8562"). InnerVolumeSpecName "kube-api-access-q87m6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:54:06 crc kubenswrapper[4563]: I1124 09:54:06.602353 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/429d655b-5312-4c9d-aded-66a40d9c8562-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "429d655b-5312-4c9d-aded-66a40d9c8562" (UID: "429d655b-5312-4c9d-aded-66a40d9c8562"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:54:06 crc kubenswrapper[4563]: I1124 09:54:06.667014 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q87m6\" (UniqueName: \"kubernetes.io/projected/429d655b-5312-4c9d-aded-66a40d9c8562-kube-api-access-q87m6\") on node \"crc\" DevicePath \"\"" Nov 24 09:54:06 crc kubenswrapper[4563]: I1124 09:54:06.667052 4563 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/429d655b-5312-4c9d-aded-66a40d9c8562-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 09:54:07 crc kubenswrapper[4563]: I1124 09:54:07.010193 4563 generic.go:334] "Generic (PLEG): container finished" podID="429d655b-5312-4c9d-aded-66a40d9c8562" containerID="9d1ecb0f5ce3cc9eac9d069806f331c95dd6e7173f0681fc9d552ca8e6d03e59" exitCode=0 Nov 24 09:54:07 crc kubenswrapper[4563]: I1124 09:54:07.010250 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sq8r2" event={"ID":"429d655b-5312-4c9d-aded-66a40d9c8562","Type":"ContainerDied","Data":"9d1ecb0f5ce3cc9eac9d069806f331c95dd6e7173f0681fc9d552ca8e6d03e59"} Nov 24 09:54:07 crc kubenswrapper[4563]: I1124 09:54:07.010467 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sq8r2" event={"ID":"429d655b-5312-4c9d-aded-66a40d9c8562","Type":"ContainerDied","Data":"154dd06a795c501e1a3a3fbb5b69beb370994ec594f782f05b873cda210c3614"} Nov 24 09:54:07 crc kubenswrapper[4563]: I1124 09:54:07.010497 4563 scope.go:117] "RemoveContainer" containerID="9d1ecb0f5ce3cc9eac9d069806f331c95dd6e7173f0681fc9d552ca8e6d03e59" Nov 24 09:54:07 crc kubenswrapper[4563]: I1124 09:54:07.010324 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sq8r2" Nov 24 09:54:07 crc kubenswrapper[4563]: I1124 09:54:07.038275 4563 scope.go:117] "RemoveContainer" containerID="e6187797c08c07edd99d2be4573dd8f2ef131f60abac98ecc9c06c1c36db41d3" Nov 24 09:54:07 crc kubenswrapper[4563]: I1124 09:54:07.041960 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sq8r2"] Nov 24 09:54:07 crc kubenswrapper[4563]: I1124 09:54:07.048595 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sq8r2"] Nov 24 09:54:07 crc kubenswrapper[4563]: I1124 09:54:07.055585 4563 scope.go:117] "RemoveContainer" containerID="3d8e8f03c2c37ea1dcd4de5979f6463d9f716c2223744812e5cf414c34eab941" Nov 24 09:54:07 crc kubenswrapper[4563]: I1124 09:54:07.064433 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="429d655b-5312-4c9d-aded-66a40d9c8562" path="/var/lib/kubelet/pods/429d655b-5312-4c9d-aded-66a40d9c8562/volumes" Nov 24 09:54:07 crc kubenswrapper[4563]: I1124 09:54:07.089895 4563 scope.go:117] "RemoveContainer" containerID="9d1ecb0f5ce3cc9eac9d069806f331c95dd6e7173f0681fc9d552ca8e6d03e59" Nov 24 09:54:07 crc kubenswrapper[4563]: E1124 09:54:07.090312 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d1ecb0f5ce3cc9eac9d069806f331c95dd6e7173f0681fc9d552ca8e6d03e59\": container with ID starting with 9d1ecb0f5ce3cc9eac9d069806f331c95dd6e7173f0681fc9d552ca8e6d03e59 not found: ID does not exist" containerID="9d1ecb0f5ce3cc9eac9d069806f331c95dd6e7173f0681fc9d552ca8e6d03e59" Nov 24 09:54:07 crc kubenswrapper[4563]: I1124 09:54:07.090354 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d1ecb0f5ce3cc9eac9d069806f331c95dd6e7173f0681fc9d552ca8e6d03e59"} err="failed to get container status 
\"9d1ecb0f5ce3cc9eac9d069806f331c95dd6e7173f0681fc9d552ca8e6d03e59\": rpc error: code = NotFound desc = could not find container \"9d1ecb0f5ce3cc9eac9d069806f331c95dd6e7173f0681fc9d552ca8e6d03e59\": container with ID starting with 9d1ecb0f5ce3cc9eac9d069806f331c95dd6e7173f0681fc9d552ca8e6d03e59 not found: ID does not exist" Nov 24 09:54:07 crc kubenswrapper[4563]: I1124 09:54:07.090383 4563 scope.go:117] "RemoveContainer" containerID="e6187797c08c07edd99d2be4573dd8f2ef131f60abac98ecc9c06c1c36db41d3" Nov 24 09:54:07 crc kubenswrapper[4563]: E1124 09:54:07.090692 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6187797c08c07edd99d2be4573dd8f2ef131f60abac98ecc9c06c1c36db41d3\": container with ID starting with e6187797c08c07edd99d2be4573dd8f2ef131f60abac98ecc9c06c1c36db41d3 not found: ID does not exist" containerID="e6187797c08c07edd99d2be4573dd8f2ef131f60abac98ecc9c06c1c36db41d3" Nov 24 09:54:07 crc kubenswrapper[4563]: I1124 09:54:07.090716 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6187797c08c07edd99d2be4573dd8f2ef131f60abac98ecc9c06c1c36db41d3"} err="failed to get container status \"e6187797c08c07edd99d2be4573dd8f2ef131f60abac98ecc9c06c1c36db41d3\": rpc error: code = NotFound desc = could not find container \"e6187797c08c07edd99d2be4573dd8f2ef131f60abac98ecc9c06c1c36db41d3\": container with ID starting with e6187797c08c07edd99d2be4573dd8f2ef131f60abac98ecc9c06c1c36db41d3 not found: ID does not exist" Nov 24 09:54:07 crc kubenswrapper[4563]: I1124 09:54:07.090729 4563 scope.go:117] "RemoveContainer" containerID="3d8e8f03c2c37ea1dcd4de5979f6463d9f716c2223744812e5cf414c34eab941" Nov 24 09:54:07 crc kubenswrapper[4563]: E1124 09:54:07.092278 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3d8e8f03c2c37ea1dcd4de5979f6463d9f716c2223744812e5cf414c34eab941\": container with ID starting with 3d8e8f03c2c37ea1dcd4de5979f6463d9f716c2223744812e5cf414c34eab941 not found: ID does not exist" containerID="3d8e8f03c2c37ea1dcd4de5979f6463d9f716c2223744812e5cf414c34eab941" Nov 24 09:54:07 crc kubenswrapper[4563]: I1124 09:54:07.092299 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d8e8f03c2c37ea1dcd4de5979f6463d9f716c2223744812e5cf414c34eab941"} err="failed to get container status \"3d8e8f03c2c37ea1dcd4de5979f6463d9f716c2223744812e5cf414c34eab941\": rpc error: code = NotFound desc = could not find container \"3d8e8f03c2c37ea1dcd4de5979f6463d9f716c2223744812e5cf414c34eab941\": container with ID starting with 3d8e8f03c2c37ea1dcd4de5979f6463d9f716c2223744812e5cf414c34eab941 not found: ID does not exist" Nov 24 09:54:11 crc kubenswrapper[4563]: I1124 09:54:11.645709 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-snk4r"] Nov 24 09:54:11 crc kubenswrapper[4563]: E1124 09:54:11.646508 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="429d655b-5312-4c9d-aded-66a40d9c8562" containerName="extract-utilities" Nov 24 09:54:11 crc kubenswrapper[4563]: I1124 09:54:11.646523 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="429d655b-5312-4c9d-aded-66a40d9c8562" containerName="extract-utilities" Nov 24 09:54:11 crc kubenswrapper[4563]: E1124 09:54:11.646535 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff74dd16-80eb-41bb-93bb-91e9f6f96748" containerName="gather" Nov 24 09:54:11 crc kubenswrapper[4563]: I1124 09:54:11.646541 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff74dd16-80eb-41bb-93bb-91e9f6f96748" containerName="gather" Nov 24 09:54:11 crc kubenswrapper[4563]: E1124 09:54:11.646569 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff74dd16-80eb-41bb-93bb-91e9f6f96748" 
containerName="copy" Nov 24 09:54:11 crc kubenswrapper[4563]: I1124 09:54:11.646575 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff74dd16-80eb-41bb-93bb-91e9f6f96748" containerName="copy" Nov 24 09:54:11 crc kubenswrapper[4563]: E1124 09:54:11.646583 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="429d655b-5312-4c9d-aded-66a40d9c8562" containerName="registry-server" Nov 24 09:54:11 crc kubenswrapper[4563]: I1124 09:54:11.646588 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="429d655b-5312-4c9d-aded-66a40d9c8562" containerName="registry-server" Nov 24 09:54:11 crc kubenswrapper[4563]: E1124 09:54:11.646600 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="429d655b-5312-4c9d-aded-66a40d9c8562" containerName="extract-content" Nov 24 09:54:11 crc kubenswrapper[4563]: I1124 09:54:11.646605 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="429d655b-5312-4c9d-aded-66a40d9c8562" containerName="extract-content" Nov 24 09:54:11 crc kubenswrapper[4563]: I1124 09:54:11.646793 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff74dd16-80eb-41bb-93bb-91e9f6f96748" containerName="copy" Nov 24 09:54:11 crc kubenswrapper[4563]: I1124 09:54:11.646822 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff74dd16-80eb-41bb-93bb-91e9f6f96748" containerName="gather" Nov 24 09:54:11 crc kubenswrapper[4563]: I1124 09:54:11.646830 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="429d655b-5312-4c9d-aded-66a40d9c8562" containerName="registry-server" Nov 24 09:54:11 crc kubenswrapper[4563]: I1124 09:54:11.647999 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-snk4r" Nov 24 09:54:11 crc kubenswrapper[4563]: I1124 09:54:11.656904 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zglwv\" (UniqueName: \"kubernetes.io/projected/d4d156a5-4986-4055-b6fc-0f23eb3e519a-kube-api-access-zglwv\") pod \"redhat-operators-snk4r\" (UID: \"d4d156a5-4986-4055-b6fc-0f23eb3e519a\") " pod="openshift-marketplace/redhat-operators-snk4r" Nov 24 09:54:11 crc kubenswrapper[4563]: I1124 09:54:11.657102 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4d156a5-4986-4055-b6fc-0f23eb3e519a-utilities\") pod \"redhat-operators-snk4r\" (UID: \"d4d156a5-4986-4055-b6fc-0f23eb3e519a\") " pod="openshift-marketplace/redhat-operators-snk4r" Nov 24 09:54:11 crc kubenswrapper[4563]: I1124 09:54:11.657155 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4d156a5-4986-4055-b6fc-0f23eb3e519a-catalog-content\") pod \"redhat-operators-snk4r\" (UID: \"d4d156a5-4986-4055-b6fc-0f23eb3e519a\") " pod="openshift-marketplace/redhat-operators-snk4r" Nov 24 09:54:11 crc kubenswrapper[4563]: I1124 09:54:11.658936 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-snk4r"] Nov 24 09:54:11 crc kubenswrapper[4563]: I1124 09:54:11.759387 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zglwv\" (UniqueName: \"kubernetes.io/projected/d4d156a5-4986-4055-b6fc-0f23eb3e519a-kube-api-access-zglwv\") pod \"redhat-operators-snk4r\" (UID: \"d4d156a5-4986-4055-b6fc-0f23eb3e519a\") " pod="openshift-marketplace/redhat-operators-snk4r" Nov 24 09:54:11 crc kubenswrapper[4563]: I1124 09:54:11.759517 4563 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4d156a5-4986-4055-b6fc-0f23eb3e519a-utilities\") pod \"redhat-operators-snk4r\" (UID: \"d4d156a5-4986-4055-b6fc-0f23eb3e519a\") " pod="openshift-marketplace/redhat-operators-snk4r" Nov 24 09:54:11 crc kubenswrapper[4563]: I1124 09:54:11.759561 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4d156a5-4986-4055-b6fc-0f23eb3e519a-catalog-content\") pod \"redhat-operators-snk4r\" (UID: \"d4d156a5-4986-4055-b6fc-0f23eb3e519a\") " pod="openshift-marketplace/redhat-operators-snk4r" Nov 24 09:54:11 crc kubenswrapper[4563]: I1124 09:54:11.760104 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4d156a5-4986-4055-b6fc-0f23eb3e519a-utilities\") pod \"redhat-operators-snk4r\" (UID: \"d4d156a5-4986-4055-b6fc-0f23eb3e519a\") " pod="openshift-marketplace/redhat-operators-snk4r" Nov 24 09:54:11 crc kubenswrapper[4563]: I1124 09:54:11.760128 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4d156a5-4986-4055-b6fc-0f23eb3e519a-catalog-content\") pod \"redhat-operators-snk4r\" (UID: \"d4d156a5-4986-4055-b6fc-0f23eb3e519a\") " pod="openshift-marketplace/redhat-operators-snk4r" Nov 24 09:54:11 crc kubenswrapper[4563]: I1124 09:54:11.779819 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zglwv\" (UniqueName: \"kubernetes.io/projected/d4d156a5-4986-4055-b6fc-0f23eb3e519a-kube-api-access-zglwv\") pod \"redhat-operators-snk4r\" (UID: \"d4d156a5-4986-4055-b6fc-0f23eb3e519a\") " pod="openshift-marketplace/redhat-operators-snk4r" Nov 24 09:54:11 crc kubenswrapper[4563]: I1124 09:54:11.962437 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-snk4r" Nov 24 09:54:12 crc kubenswrapper[4563]: I1124 09:54:12.362193 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-snk4r"] Nov 24 09:54:12 crc kubenswrapper[4563]: W1124 09:54:12.365686 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4d156a5_4986_4055_b6fc_0f23eb3e519a.slice/crio-33e6fadad3bdbd220c46cf9c342a9230513ff6ee6921ced3000bfbb13eff6c21 WatchSource:0}: Error finding container 33e6fadad3bdbd220c46cf9c342a9230513ff6ee6921ced3000bfbb13eff6c21: Status 404 returned error can't find the container with id 33e6fadad3bdbd220c46cf9c342a9230513ff6ee6921ced3000bfbb13eff6c21 Nov 24 09:54:13 crc kubenswrapper[4563]: I1124 09:54:13.087142 4563 generic.go:334] "Generic (PLEG): container finished" podID="d4d156a5-4986-4055-b6fc-0f23eb3e519a" containerID="8c2fc916ab4034bf98b302cf597dbfc17adf6d09b3cabcadee1178e53750e434" exitCode=0 Nov 24 09:54:13 crc kubenswrapper[4563]: I1124 09:54:13.087249 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snk4r" event={"ID":"d4d156a5-4986-4055-b6fc-0f23eb3e519a","Type":"ContainerDied","Data":"8c2fc916ab4034bf98b302cf597dbfc17adf6d09b3cabcadee1178e53750e434"} Nov 24 09:54:13 crc kubenswrapper[4563]: I1124 09:54:13.087516 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snk4r" event={"ID":"d4d156a5-4986-4055-b6fc-0f23eb3e519a","Type":"ContainerStarted","Data":"33e6fadad3bdbd220c46cf9c342a9230513ff6ee6921ced3000bfbb13eff6c21"} Nov 24 09:54:14 crc kubenswrapper[4563]: I1124 09:54:14.096188 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snk4r" 
event={"ID":"d4d156a5-4986-4055-b6fc-0f23eb3e519a","Type":"ContainerStarted","Data":"c44cc4d54d4d02f666e3e79a43f7d4acfd78fb82204101072c1f0efa138932f5"} Nov 24 09:54:16 crc kubenswrapper[4563]: I1124 09:54:16.115186 4563 generic.go:334] "Generic (PLEG): container finished" podID="d4d156a5-4986-4055-b6fc-0f23eb3e519a" containerID="c44cc4d54d4d02f666e3e79a43f7d4acfd78fb82204101072c1f0efa138932f5" exitCode=0 Nov 24 09:54:16 crc kubenswrapper[4563]: I1124 09:54:16.115275 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snk4r" event={"ID":"d4d156a5-4986-4055-b6fc-0f23eb3e519a","Type":"ContainerDied","Data":"c44cc4d54d4d02f666e3e79a43f7d4acfd78fb82204101072c1f0efa138932f5"} Nov 24 09:54:17 crc kubenswrapper[4563]: I1124 09:54:17.127514 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snk4r" event={"ID":"d4d156a5-4986-4055-b6fc-0f23eb3e519a","Type":"ContainerStarted","Data":"28700df2436ce086aba6cdbd0491f5be96037a85dca61874d34336fff5a631d2"} Nov 24 09:54:17 crc kubenswrapper[4563]: I1124 09:54:17.147545 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-snk4r" podStartSLOduration=2.644090525 podStartE2EDuration="6.147528184s" podCreationTimestamp="2025-11-24 09:54:11 +0000 UTC" firstStartedPulling="2025-11-24 09:54:13.089458898 +0000 UTC m=+3030.348436345" lastFinishedPulling="2025-11-24 09:54:16.592896557 +0000 UTC m=+3033.851874004" observedRunningTime="2025-11-24 09:54:17.144804338 +0000 UTC m=+3034.403781786" watchObservedRunningTime="2025-11-24 09:54:17.147528184 +0000 UTC m=+3034.406505631" Nov 24 09:54:21 crc kubenswrapper[4563]: I1124 09:54:21.962875 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-snk4r" Nov 24 09:54:21 crc kubenswrapper[4563]: I1124 09:54:21.963225 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-snk4r" Nov 24 09:54:22 crc kubenswrapper[4563]: I1124 09:54:22.000364 4563 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-snk4r" Nov 24 09:54:22 crc kubenswrapper[4563]: I1124 09:54:22.212390 4563 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-snk4r" Nov 24 09:54:22 crc kubenswrapper[4563]: I1124 09:54:22.249926 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-snk4r"] Nov 24 09:54:24 crc kubenswrapper[4563]: I1124 09:54:24.190429 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-snk4r" podUID="d4d156a5-4986-4055-b6fc-0f23eb3e519a" containerName="registry-server" containerID="cri-o://28700df2436ce086aba6cdbd0491f5be96037a85dca61874d34336fff5a631d2" gracePeriod=2 Nov 24 09:54:24 crc kubenswrapper[4563]: I1124 09:54:24.567290 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-snk4r" Nov 24 09:54:24 crc kubenswrapper[4563]: I1124 09:54:24.723400 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zglwv\" (UniqueName: \"kubernetes.io/projected/d4d156a5-4986-4055-b6fc-0f23eb3e519a-kube-api-access-zglwv\") pod \"d4d156a5-4986-4055-b6fc-0f23eb3e519a\" (UID: \"d4d156a5-4986-4055-b6fc-0f23eb3e519a\") " Nov 24 09:54:24 crc kubenswrapper[4563]: I1124 09:54:24.723460 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4d156a5-4986-4055-b6fc-0f23eb3e519a-catalog-content\") pod \"d4d156a5-4986-4055-b6fc-0f23eb3e519a\" (UID: \"d4d156a5-4986-4055-b6fc-0f23eb3e519a\") " Nov 24 09:54:24 crc kubenswrapper[4563]: I1124 09:54:24.723547 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4d156a5-4986-4055-b6fc-0f23eb3e519a-utilities\") pod \"d4d156a5-4986-4055-b6fc-0f23eb3e519a\" (UID: \"d4d156a5-4986-4055-b6fc-0f23eb3e519a\") " Nov 24 09:54:24 crc kubenswrapper[4563]: I1124 09:54:24.724392 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4d156a5-4986-4055-b6fc-0f23eb3e519a-utilities" (OuterVolumeSpecName: "utilities") pod "d4d156a5-4986-4055-b6fc-0f23eb3e519a" (UID: "d4d156a5-4986-4055-b6fc-0f23eb3e519a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:54:24 crc kubenswrapper[4563]: I1124 09:54:24.728954 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4d156a5-4986-4055-b6fc-0f23eb3e519a-kube-api-access-zglwv" (OuterVolumeSpecName: "kube-api-access-zglwv") pod "d4d156a5-4986-4055-b6fc-0f23eb3e519a" (UID: "d4d156a5-4986-4055-b6fc-0f23eb3e519a"). InnerVolumeSpecName "kube-api-access-zglwv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:54:24 crc kubenswrapper[4563]: I1124 09:54:24.825847 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zglwv\" (UniqueName: \"kubernetes.io/projected/d4d156a5-4986-4055-b6fc-0f23eb3e519a-kube-api-access-zglwv\") on node \"crc\" DevicePath \"\"" Nov 24 09:54:24 crc kubenswrapper[4563]: I1124 09:54:24.825902 4563 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4d156a5-4986-4055-b6fc-0f23eb3e519a-utilities\") on node \"crc\" DevicePath \"\"" Nov 24 09:54:25 crc kubenswrapper[4563]: I1124 09:54:25.201834 4563 generic.go:334] "Generic (PLEG): container finished" podID="d4d156a5-4986-4055-b6fc-0f23eb3e519a" containerID="28700df2436ce086aba6cdbd0491f5be96037a85dca61874d34336fff5a631d2" exitCode=0 Nov 24 09:54:25 crc kubenswrapper[4563]: I1124 09:54:25.201888 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snk4r" event={"ID":"d4d156a5-4986-4055-b6fc-0f23eb3e519a","Type":"ContainerDied","Data":"28700df2436ce086aba6cdbd0491f5be96037a85dca61874d34336fff5a631d2"} Nov 24 09:54:25 crc kubenswrapper[4563]: I1124 09:54:25.201920 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snk4r" event={"ID":"d4d156a5-4986-4055-b6fc-0f23eb3e519a","Type":"ContainerDied","Data":"33e6fadad3bdbd220c46cf9c342a9230513ff6ee6921ced3000bfbb13eff6c21"} Nov 24 09:54:25 crc kubenswrapper[4563]: I1124 09:54:25.201939 4563 scope.go:117] "RemoveContainer" containerID="28700df2436ce086aba6cdbd0491f5be96037a85dca61874d34336fff5a631d2" Nov 24 09:54:25 crc kubenswrapper[4563]: I1124 09:54:25.201958 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-snk4r" Nov 24 09:54:25 crc kubenswrapper[4563]: I1124 09:54:25.218513 4563 scope.go:117] "RemoveContainer" containerID="c44cc4d54d4d02f666e3e79a43f7d4acfd78fb82204101072c1f0efa138932f5" Nov 24 09:54:25 crc kubenswrapper[4563]: I1124 09:54:25.232921 4563 scope.go:117] "RemoveContainer" containerID="8c2fc916ab4034bf98b302cf597dbfc17adf6d09b3cabcadee1178e53750e434" Nov 24 09:54:25 crc kubenswrapper[4563]: I1124 09:54:25.266219 4563 scope.go:117] "RemoveContainer" containerID="28700df2436ce086aba6cdbd0491f5be96037a85dca61874d34336fff5a631d2" Nov 24 09:54:25 crc kubenswrapper[4563]: E1124 09:54:25.266589 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28700df2436ce086aba6cdbd0491f5be96037a85dca61874d34336fff5a631d2\": container with ID starting with 28700df2436ce086aba6cdbd0491f5be96037a85dca61874d34336fff5a631d2 not found: ID does not exist" containerID="28700df2436ce086aba6cdbd0491f5be96037a85dca61874d34336fff5a631d2" Nov 24 09:54:25 crc kubenswrapper[4563]: I1124 09:54:25.266623 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28700df2436ce086aba6cdbd0491f5be96037a85dca61874d34336fff5a631d2"} err="failed to get container status \"28700df2436ce086aba6cdbd0491f5be96037a85dca61874d34336fff5a631d2\": rpc error: code = NotFound desc = could not find container \"28700df2436ce086aba6cdbd0491f5be96037a85dca61874d34336fff5a631d2\": container with ID starting with 28700df2436ce086aba6cdbd0491f5be96037a85dca61874d34336fff5a631d2 not found: ID does not exist" Nov 24 09:54:25 crc kubenswrapper[4563]: I1124 09:54:25.266659 4563 scope.go:117] "RemoveContainer" containerID="c44cc4d54d4d02f666e3e79a43f7d4acfd78fb82204101072c1f0efa138932f5" Nov 24 09:54:25 crc kubenswrapper[4563]: E1124 09:54:25.267193 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"c44cc4d54d4d02f666e3e79a43f7d4acfd78fb82204101072c1f0efa138932f5\": container with ID starting with c44cc4d54d4d02f666e3e79a43f7d4acfd78fb82204101072c1f0efa138932f5 not found: ID does not exist" containerID="c44cc4d54d4d02f666e3e79a43f7d4acfd78fb82204101072c1f0efa138932f5" Nov 24 09:54:25 crc kubenswrapper[4563]: I1124 09:54:25.267216 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c44cc4d54d4d02f666e3e79a43f7d4acfd78fb82204101072c1f0efa138932f5"} err="failed to get container status \"c44cc4d54d4d02f666e3e79a43f7d4acfd78fb82204101072c1f0efa138932f5\": rpc error: code = NotFound desc = could not find container \"c44cc4d54d4d02f666e3e79a43f7d4acfd78fb82204101072c1f0efa138932f5\": container with ID starting with c44cc4d54d4d02f666e3e79a43f7d4acfd78fb82204101072c1f0efa138932f5 not found: ID does not exist" Nov 24 09:54:25 crc kubenswrapper[4563]: I1124 09:54:25.267232 4563 scope.go:117] "RemoveContainer" containerID="8c2fc916ab4034bf98b302cf597dbfc17adf6d09b3cabcadee1178e53750e434" Nov 24 09:54:25 crc kubenswrapper[4563]: E1124 09:54:25.267630 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c2fc916ab4034bf98b302cf597dbfc17adf6d09b3cabcadee1178e53750e434\": container with ID starting with 8c2fc916ab4034bf98b302cf597dbfc17adf6d09b3cabcadee1178e53750e434 not found: ID does not exist" containerID="8c2fc916ab4034bf98b302cf597dbfc17adf6d09b3cabcadee1178e53750e434" Nov 24 09:54:25 crc kubenswrapper[4563]: I1124 09:54:25.267727 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c2fc916ab4034bf98b302cf597dbfc17adf6d09b3cabcadee1178e53750e434"} err="failed to get container status \"8c2fc916ab4034bf98b302cf597dbfc17adf6d09b3cabcadee1178e53750e434\": rpc error: code = NotFound desc = could not find container 
\"8c2fc916ab4034bf98b302cf597dbfc17adf6d09b3cabcadee1178e53750e434\": container with ID starting with 8c2fc916ab4034bf98b302cf597dbfc17adf6d09b3cabcadee1178e53750e434 not found: ID does not exist" Nov 24 09:54:25 crc kubenswrapper[4563]: I1124 09:54:25.339487 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4d156a5-4986-4055-b6fc-0f23eb3e519a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4d156a5-4986-4055-b6fc-0f23eb3e519a" (UID: "d4d156a5-4986-4055-b6fc-0f23eb3e519a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:54:25 crc kubenswrapper[4563]: I1124 09:54:25.439179 4563 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4d156a5-4986-4055-b6fc-0f23eb3e519a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 24 09:54:25 crc kubenswrapper[4563]: I1124 09:54:25.528128 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-snk4r"] Nov 24 09:54:25 crc kubenswrapper[4563]: I1124 09:54:25.533416 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-snk4r"] Nov 24 09:54:27 crc kubenswrapper[4563]: I1124 09:54:27.063111 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4d156a5-4986-4055-b6fc-0f23eb3e519a" path="/var/lib/kubelet/pods/d4d156a5-4986-4055-b6fc-0f23eb3e519a/volumes" Nov 24 09:54:38 crc kubenswrapper[4563]: I1124 09:54:38.987372 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:54:38 crc kubenswrapper[4563]: I1124 09:54:38.987824 4563 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:55:08 crc kubenswrapper[4563]: I1124 09:55:08.987113 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:55:08 crc kubenswrapper[4563]: I1124 09:55:08.987564 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:55:38 crc kubenswrapper[4563]: I1124 09:55:38.987480 4563 patch_prober.go:28] interesting pod/machine-config-daemon-stlxr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 24 09:55:38 crc kubenswrapper[4563]: I1124 09:55:38.987872 4563 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 24 09:55:38 crc kubenswrapper[4563]: I1124 09:55:38.987926 4563 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" Nov 24 09:55:38 crc 
kubenswrapper[4563]: I1124 09:55:38.988617 4563 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108"} pod="openshift-machine-config-operator/machine-config-daemon-stlxr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 24 09:55:38 crc kubenswrapper[4563]: I1124 09:55:38.988682 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" containerName="machine-config-daemon" containerID="cri-o://b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108" gracePeriod=600 Nov 24 09:55:39 crc kubenswrapper[4563]: E1124 09:55:39.109498 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:55:39 crc kubenswrapper[4563]: I1124 09:55:39.714528 4563 generic.go:334] "Generic (PLEG): container finished" podID="3b2bfe55-8989-49b3-bb61-e28189447627" containerID="b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108" exitCode=0 Nov 24 09:55:39 crc kubenswrapper[4563]: I1124 09:55:39.714582 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" event={"ID":"3b2bfe55-8989-49b3-bb61-e28189447627","Type":"ContainerDied","Data":"b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108"} Nov 24 09:55:39 crc kubenswrapper[4563]: I1124 09:55:39.715108 4563 scope.go:117] "RemoveContainer" 
containerID="61bede8c7960fe7299ad0cfd68d688b281ba1733220ddc88fefa904a4696a51c" Nov 24 09:55:39 crc kubenswrapper[4563]: I1124 09:55:39.715431 4563 scope.go:117] "RemoveContainer" containerID="b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108" Nov 24 09:55:39 crc kubenswrapper[4563]: E1124 09:55:39.715726 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:55:50 crc kubenswrapper[4563]: I1124 09:55:50.055298 4563 scope.go:117] "RemoveContainer" containerID="b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108" Nov 24 09:55:50 crc kubenswrapper[4563]: E1124 09:55:50.055838 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:56:05 crc kubenswrapper[4563]: I1124 09:56:05.054542 4563 scope.go:117] "RemoveContainer" containerID="b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108" Nov 24 09:56:05 crc kubenswrapper[4563]: E1124 09:56:05.055132 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:56:05 crc kubenswrapper[4563]: I1124 09:56:05.670597 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h7gkj/must-gather-v87wf"] Nov 24 09:56:05 crc kubenswrapper[4563]: E1124 09:56:05.671190 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d156a5-4986-4055-b6fc-0f23eb3e519a" containerName="registry-server" Nov 24 09:56:05 crc kubenswrapper[4563]: I1124 09:56:05.671208 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d156a5-4986-4055-b6fc-0f23eb3e519a" containerName="registry-server" Nov 24 09:56:05 crc kubenswrapper[4563]: E1124 09:56:05.671226 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d156a5-4986-4055-b6fc-0f23eb3e519a" containerName="extract-utilities" Nov 24 09:56:05 crc kubenswrapper[4563]: I1124 09:56:05.671232 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d156a5-4986-4055-b6fc-0f23eb3e519a" containerName="extract-utilities" Nov 24 09:56:05 crc kubenswrapper[4563]: E1124 09:56:05.671256 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d156a5-4986-4055-b6fc-0f23eb3e519a" containerName="extract-content" Nov 24 09:56:05 crc kubenswrapper[4563]: I1124 09:56:05.671262 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d156a5-4986-4055-b6fc-0f23eb3e519a" containerName="extract-content" Nov 24 09:56:05 crc kubenswrapper[4563]: I1124 09:56:05.671432 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4d156a5-4986-4055-b6fc-0f23eb3e519a" containerName="registry-server" Nov 24 09:56:05 crc kubenswrapper[4563]: I1124 09:56:05.672359 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h7gkj/must-gather-v87wf" Nov 24 09:56:05 crc kubenswrapper[4563]: I1124 09:56:05.673724 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-h7gkj"/"kube-root-ca.crt" Nov 24 09:56:05 crc kubenswrapper[4563]: I1124 09:56:05.674057 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-h7gkj"/"openshift-service-ca.crt" Nov 24 09:56:05 crc kubenswrapper[4563]: I1124 09:56:05.681573 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h7gkj/must-gather-v87wf"] Nov 24 09:56:05 crc kubenswrapper[4563]: I1124 09:56:05.732514 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/73563124-8005-419b-ae79-70a52a25a823-must-gather-output\") pod \"must-gather-v87wf\" (UID: \"73563124-8005-419b-ae79-70a52a25a823\") " pod="openshift-must-gather-h7gkj/must-gather-v87wf" Nov 24 09:56:05 crc kubenswrapper[4563]: I1124 09:56:05.733030 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krlx9\" (UniqueName: \"kubernetes.io/projected/73563124-8005-419b-ae79-70a52a25a823-kube-api-access-krlx9\") pod \"must-gather-v87wf\" (UID: \"73563124-8005-419b-ae79-70a52a25a823\") " pod="openshift-must-gather-h7gkj/must-gather-v87wf" Nov 24 09:56:05 crc kubenswrapper[4563]: I1124 09:56:05.833835 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krlx9\" (UniqueName: \"kubernetes.io/projected/73563124-8005-419b-ae79-70a52a25a823-kube-api-access-krlx9\") pod \"must-gather-v87wf\" (UID: \"73563124-8005-419b-ae79-70a52a25a823\") " pod="openshift-must-gather-h7gkj/must-gather-v87wf" Nov 24 09:56:05 crc kubenswrapper[4563]: I1124 09:56:05.833925 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/73563124-8005-419b-ae79-70a52a25a823-must-gather-output\") pod \"must-gather-v87wf\" (UID: \"73563124-8005-419b-ae79-70a52a25a823\") " pod="openshift-must-gather-h7gkj/must-gather-v87wf" Nov 24 09:56:05 crc kubenswrapper[4563]: I1124 09:56:05.834296 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/73563124-8005-419b-ae79-70a52a25a823-must-gather-output\") pod \"must-gather-v87wf\" (UID: \"73563124-8005-419b-ae79-70a52a25a823\") " pod="openshift-must-gather-h7gkj/must-gather-v87wf" Nov 24 09:56:05 crc kubenswrapper[4563]: I1124 09:56:05.849783 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krlx9\" (UniqueName: \"kubernetes.io/projected/73563124-8005-419b-ae79-70a52a25a823-kube-api-access-krlx9\") pod \"must-gather-v87wf\" (UID: \"73563124-8005-419b-ae79-70a52a25a823\") " pod="openshift-must-gather-h7gkj/must-gather-v87wf" Nov 24 09:56:05 crc kubenswrapper[4563]: I1124 09:56:05.986632 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h7gkj/must-gather-v87wf" Nov 24 09:56:06 crc kubenswrapper[4563]: I1124 09:56:06.382537 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h7gkj/must-gather-v87wf"] Nov 24 09:56:06 crc kubenswrapper[4563]: I1124 09:56:06.913350 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h7gkj/must-gather-v87wf" event={"ID":"73563124-8005-419b-ae79-70a52a25a823","Type":"ContainerStarted","Data":"a4fb22a31c4810ea4f9a967db90b4ba0697fa6e5cb579ad943ecfc162521b9c1"} Nov 24 09:56:06 crc kubenswrapper[4563]: I1124 09:56:06.913578 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h7gkj/must-gather-v87wf" event={"ID":"73563124-8005-419b-ae79-70a52a25a823","Type":"ContainerStarted","Data":"7c4f9ff8c22199340577363c0c05b7c2015f0a43cf29aec58e24be79a519a2f0"} Nov 24 09:56:06 crc kubenswrapper[4563]: I1124 09:56:06.913593 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h7gkj/must-gather-v87wf" event={"ID":"73563124-8005-419b-ae79-70a52a25a823","Type":"ContainerStarted","Data":"a5e69689cf8cfeafa499f962396fc8c70652f7e658022cd36cc8496f4a68f607"} Nov 24 09:56:06 crc kubenswrapper[4563]: I1124 09:56:06.927479 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-h7gkj/must-gather-v87wf" podStartSLOduration=1.927469503 podStartE2EDuration="1.927469503s" podCreationTimestamp="2025-11-24 09:56:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:56:06.924049103 +0000 UTC m=+3144.183026551" watchObservedRunningTime="2025-11-24 09:56:06.927469503 +0000 UTC m=+3144.186446949" Nov 24 09:56:09 crc kubenswrapper[4563]: I1124 09:56:09.120002 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h7gkj/crc-debug-cq6dl"] Nov 24 09:56:09 crc kubenswrapper[4563]: 
I1124 09:56:09.121332 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h7gkj/crc-debug-cq6dl" Nov 24 09:56:09 crc kubenswrapper[4563]: I1124 09:56:09.123363 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-h7gkj"/"default-dockercfg-6wfzz" Nov 24 09:56:09 crc kubenswrapper[4563]: I1124 09:56:09.197725 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8k58\" (UniqueName: \"kubernetes.io/projected/b787d135-f802-4a18-bdbb-7d775f277afe-kube-api-access-h8k58\") pod \"crc-debug-cq6dl\" (UID: \"b787d135-f802-4a18-bdbb-7d775f277afe\") " pod="openshift-must-gather-h7gkj/crc-debug-cq6dl" Nov 24 09:56:09 crc kubenswrapper[4563]: I1124 09:56:09.197972 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b787d135-f802-4a18-bdbb-7d775f277afe-host\") pod \"crc-debug-cq6dl\" (UID: \"b787d135-f802-4a18-bdbb-7d775f277afe\") " pod="openshift-must-gather-h7gkj/crc-debug-cq6dl" Nov 24 09:56:09 crc kubenswrapper[4563]: I1124 09:56:09.299359 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8k58\" (UniqueName: \"kubernetes.io/projected/b787d135-f802-4a18-bdbb-7d775f277afe-kube-api-access-h8k58\") pod \"crc-debug-cq6dl\" (UID: \"b787d135-f802-4a18-bdbb-7d775f277afe\") " pod="openshift-must-gather-h7gkj/crc-debug-cq6dl" Nov 24 09:56:09 crc kubenswrapper[4563]: I1124 09:56:09.299506 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b787d135-f802-4a18-bdbb-7d775f277afe-host\") pod \"crc-debug-cq6dl\" (UID: \"b787d135-f802-4a18-bdbb-7d775f277afe\") " pod="openshift-must-gather-h7gkj/crc-debug-cq6dl" Nov 24 09:56:09 crc kubenswrapper[4563]: I1124 09:56:09.299711 4563 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b787d135-f802-4a18-bdbb-7d775f277afe-host\") pod \"crc-debug-cq6dl\" (UID: \"b787d135-f802-4a18-bdbb-7d775f277afe\") " pod="openshift-must-gather-h7gkj/crc-debug-cq6dl" Nov 24 09:56:09 crc kubenswrapper[4563]: I1124 09:56:09.315330 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8k58\" (UniqueName: \"kubernetes.io/projected/b787d135-f802-4a18-bdbb-7d775f277afe-kube-api-access-h8k58\") pod \"crc-debug-cq6dl\" (UID: \"b787d135-f802-4a18-bdbb-7d775f277afe\") " pod="openshift-must-gather-h7gkj/crc-debug-cq6dl" Nov 24 09:56:09 crc kubenswrapper[4563]: I1124 09:56:09.441321 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h7gkj/crc-debug-cq6dl" Nov 24 09:56:09 crc kubenswrapper[4563]: I1124 09:56:09.934732 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h7gkj/crc-debug-cq6dl" event={"ID":"b787d135-f802-4a18-bdbb-7d775f277afe","Type":"ContainerStarted","Data":"dd847fd3b8d268414fba2adf2d728849e2b078cafc8c0d8959f67bfa01218c38"} Nov 24 09:56:09 crc kubenswrapper[4563]: I1124 09:56:09.935103 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h7gkj/crc-debug-cq6dl" event={"ID":"b787d135-f802-4a18-bdbb-7d775f277afe","Type":"ContainerStarted","Data":"a4ef76c9078e63dedda0c67c4f50fa5c3abc2daf7f58b3b89d309049da63aa24"} Nov 24 09:56:09 crc kubenswrapper[4563]: I1124 09:56:09.945974 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-h7gkj/crc-debug-cq6dl" podStartSLOduration=0.945963433 podStartE2EDuration="945.963433ms" podCreationTimestamp="2025-11-24 09:56:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:56:09.944563614 +0000 UTC m=+3147.203541062" watchObservedRunningTime="2025-11-24 09:56:09.945963433 
+0000 UTC m=+3147.204940880" Nov 24 09:56:18 crc kubenswrapper[4563]: I1124 09:56:18.054436 4563 scope.go:117] "RemoveContainer" containerID="b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108" Nov 24 09:56:18 crc kubenswrapper[4563]: E1124 09:56:18.055066 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:56:33 crc kubenswrapper[4563]: I1124 09:56:33.063688 4563 scope.go:117] "RemoveContainer" containerID="b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108" Nov 24 09:56:33 crc kubenswrapper[4563]: E1124 09:56:33.064398 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:56:35 crc kubenswrapper[4563]: I1124 09:56:35.119270 4563 generic.go:334] "Generic (PLEG): container finished" podID="b787d135-f802-4a18-bdbb-7d775f277afe" containerID="dd847fd3b8d268414fba2adf2d728849e2b078cafc8c0d8959f67bfa01218c38" exitCode=0 Nov 24 09:56:35 crc kubenswrapper[4563]: I1124 09:56:35.119314 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h7gkj/crc-debug-cq6dl" event={"ID":"b787d135-f802-4a18-bdbb-7d775f277afe","Type":"ContainerDied","Data":"dd847fd3b8d268414fba2adf2d728849e2b078cafc8c0d8959f67bfa01218c38"} Nov 24 09:56:36 crc kubenswrapper[4563]: I1124 
09:56:36.201342 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h7gkj/crc-debug-cq6dl" Nov 24 09:56:36 crc kubenswrapper[4563]: I1124 09:56:36.230325 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-h7gkj/crc-debug-cq6dl"] Nov 24 09:56:36 crc kubenswrapper[4563]: I1124 09:56:36.234985 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-h7gkj/crc-debug-cq6dl"] Nov 24 09:56:36 crc kubenswrapper[4563]: I1124 09:56:36.331724 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b787d135-f802-4a18-bdbb-7d775f277afe-host\") pod \"b787d135-f802-4a18-bdbb-7d775f277afe\" (UID: \"b787d135-f802-4a18-bdbb-7d775f277afe\") " Nov 24 09:56:36 crc kubenswrapper[4563]: I1124 09:56:36.331742 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b787d135-f802-4a18-bdbb-7d775f277afe-host" (OuterVolumeSpecName: "host") pod "b787d135-f802-4a18-bdbb-7d775f277afe" (UID: "b787d135-f802-4a18-bdbb-7d775f277afe"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:56:36 crc kubenswrapper[4563]: I1124 09:56:36.331860 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8k58\" (UniqueName: \"kubernetes.io/projected/b787d135-f802-4a18-bdbb-7d775f277afe-kube-api-access-h8k58\") pod \"b787d135-f802-4a18-bdbb-7d775f277afe\" (UID: \"b787d135-f802-4a18-bdbb-7d775f277afe\") " Nov 24 09:56:36 crc kubenswrapper[4563]: I1124 09:56:36.332681 4563 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b787d135-f802-4a18-bdbb-7d775f277afe-host\") on node \"crc\" DevicePath \"\"" Nov 24 09:56:36 crc kubenswrapper[4563]: I1124 09:56:36.336590 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b787d135-f802-4a18-bdbb-7d775f277afe-kube-api-access-h8k58" (OuterVolumeSpecName: "kube-api-access-h8k58") pod "b787d135-f802-4a18-bdbb-7d775f277afe" (UID: "b787d135-f802-4a18-bdbb-7d775f277afe"). InnerVolumeSpecName "kube-api-access-h8k58". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:56:36 crc kubenswrapper[4563]: I1124 09:56:36.435306 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8k58\" (UniqueName: \"kubernetes.io/projected/b787d135-f802-4a18-bdbb-7d775f277afe-kube-api-access-h8k58\") on node \"crc\" DevicePath \"\"" Nov 24 09:56:37 crc kubenswrapper[4563]: I1124 09:56:37.063324 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b787d135-f802-4a18-bdbb-7d775f277afe" path="/var/lib/kubelet/pods/b787d135-f802-4a18-bdbb-7d775f277afe/volumes" Nov 24 09:56:37 crc kubenswrapper[4563]: I1124 09:56:37.140081 4563 scope.go:117] "RemoveContainer" containerID="dd847fd3b8d268414fba2adf2d728849e2b078cafc8c0d8959f67bfa01218c38" Nov 24 09:56:37 crc kubenswrapper[4563]: I1124 09:56:37.140234 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h7gkj/crc-debug-cq6dl" Nov 24 09:56:37 crc kubenswrapper[4563]: I1124 09:56:37.346739 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h7gkj/crc-debug-8mkqb"] Nov 24 09:56:37 crc kubenswrapper[4563]: E1124 09:56:37.348480 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b787d135-f802-4a18-bdbb-7d775f277afe" containerName="container-00" Nov 24 09:56:37 crc kubenswrapper[4563]: I1124 09:56:37.348568 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="b787d135-f802-4a18-bdbb-7d775f277afe" containerName="container-00" Nov 24 09:56:37 crc kubenswrapper[4563]: I1124 09:56:37.349096 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="b787d135-f802-4a18-bdbb-7d775f277afe" containerName="container-00" Nov 24 09:56:37 crc kubenswrapper[4563]: I1124 09:56:37.349855 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h7gkj/crc-debug-8mkqb" Nov 24 09:56:37 crc kubenswrapper[4563]: I1124 09:56:37.352287 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-h7gkj"/"default-dockercfg-6wfzz" Nov 24 09:56:37 crc kubenswrapper[4563]: I1124 09:56:37.353158 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtgq8\" (UniqueName: \"kubernetes.io/projected/544409da-ab29-4a2c-bf2b-6f2404c64422-kube-api-access-jtgq8\") pod \"crc-debug-8mkqb\" (UID: \"544409da-ab29-4a2c-bf2b-6f2404c64422\") " pod="openshift-must-gather-h7gkj/crc-debug-8mkqb" Nov 24 09:56:37 crc kubenswrapper[4563]: I1124 09:56:37.353270 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/544409da-ab29-4a2c-bf2b-6f2404c64422-host\") pod \"crc-debug-8mkqb\" (UID: \"544409da-ab29-4a2c-bf2b-6f2404c64422\") " 
pod="openshift-must-gather-h7gkj/crc-debug-8mkqb" Nov 24 09:56:37 crc kubenswrapper[4563]: I1124 09:56:37.455719 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtgq8\" (UniqueName: \"kubernetes.io/projected/544409da-ab29-4a2c-bf2b-6f2404c64422-kube-api-access-jtgq8\") pod \"crc-debug-8mkqb\" (UID: \"544409da-ab29-4a2c-bf2b-6f2404c64422\") " pod="openshift-must-gather-h7gkj/crc-debug-8mkqb" Nov 24 09:56:37 crc kubenswrapper[4563]: I1124 09:56:37.456098 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/544409da-ab29-4a2c-bf2b-6f2404c64422-host\") pod \"crc-debug-8mkqb\" (UID: \"544409da-ab29-4a2c-bf2b-6f2404c64422\") " pod="openshift-must-gather-h7gkj/crc-debug-8mkqb" Nov 24 09:56:37 crc kubenswrapper[4563]: I1124 09:56:37.456149 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/544409da-ab29-4a2c-bf2b-6f2404c64422-host\") pod \"crc-debug-8mkqb\" (UID: \"544409da-ab29-4a2c-bf2b-6f2404c64422\") " pod="openshift-must-gather-h7gkj/crc-debug-8mkqb" Nov 24 09:56:37 crc kubenswrapper[4563]: I1124 09:56:37.482992 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtgq8\" (UniqueName: \"kubernetes.io/projected/544409da-ab29-4a2c-bf2b-6f2404c64422-kube-api-access-jtgq8\") pod \"crc-debug-8mkqb\" (UID: \"544409da-ab29-4a2c-bf2b-6f2404c64422\") " pod="openshift-must-gather-h7gkj/crc-debug-8mkqb" Nov 24 09:56:37 crc kubenswrapper[4563]: I1124 09:56:37.670335 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h7gkj/crc-debug-8mkqb" Nov 24 09:56:37 crc kubenswrapper[4563]: W1124 09:56:37.715104 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod544409da_ab29_4a2c_bf2b_6f2404c64422.slice/crio-b30b72005cb10f7c9511f317421efc350fff4af797ba07855fe2b37118c2e548 WatchSource:0}: Error finding container b30b72005cb10f7c9511f317421efc350fff4af797ba07855fe2b37118c2e548: Status 404 returned error can't find the container with id b30b72005cb10f7c9511f317421efc350fff4af797ba07855fe2b37118c2e548 Nov 24 09:56:38 crc kubenswrapper[4563]: I1124 09:56:38.147832 4563 generic.go:334] "Generic (PLEG): container finished" podID="544409da-ab29-4a2c-bf2b-6f2404c64422" containerID="2dae0a1407577fae9291deaa4016d8e6a1e5725f568a28eb7c64cbccb7aec9b5" exitCode=0 Nov 24 09:56:38 crc kubenswrapper[4563]: I1124 09:56:38.147902 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h7gkj/crc-debug-8mkqb" event={"ID":"544409da-ab29-4a2c-bf2b-6f2404c64422","Type":"ContainerDied","Data":"2dae0a1407577fae9291deaa4016d8e6a1e5725f568a28eb7c64cbccb7aec9b5"} Nov 24 09:56:38 crc kubenswrapper[4563]: I1124 09:56:38.148151 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h7gkj/crc-debug-8mkqb" event={"ID":"544409da-ab29-4a2c-bf2b-6f2404c64422","Type":"ContainerStarted","Data":"b30b72005cb10f7c9511f317421efc350fff4af797ba07855fe2b37118c2e548"} Nov 24 09:56:38 crc kubenswrapper[4563]: I1124 09:56:38.559980 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-h7gkj/crc-debug-8mkqb"] Nov 24 09:56:38 crc kubenswrapper[4563]: I1124 09:56:38.567538 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-h7gkj/crc-debug-8mkqb"] Nov 24 09:56:39 crc kubenswrapper[4563]: I1124 09:56:39.223599 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h7gkj/crc-debug-8mkqb" Nov 24 09:56:39 crc kubenswrapper[4563]: I1124 09:56:39.382449 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtgq8\" (UniqueName: \"kubernetes.io/projected/544409da-ab29-4a2c-bf2b-6f2404c64422-kube-api-access-jtgq8\") pod \"544409da-ab29-4a2c-bf2b-6f2404c64422\" (UID: \"544409da-ab29-4a2c-bf2b-6f2404c64422\") " Nov 24 09:56:39 crc kubenswrapper[4563]: I1124 09:56:39.382768 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/544409da-ab29-4a2c-bf2b-6f2404c64422-host\") pod \"544409da-ab29-4a2c-bf2b-6f2404c64422\" (UID: \"544409da-ab29-4a2c-bf2b-6f2404c64422\") " Nov 24 09:56:39 crc kubenswrapper[4563]: I1124 09:56:39.382845 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/544409da-ab29-4a2c-bf2b-6f2404c64422-host" (OuterVolumeSpecName: "host") pod "544409da-ab29-4a2c-bf2b-6f2404c64422" (UID: "544409da-ab29-4a2c-bf2b-6f2404c64422"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:56:39 crc kubenswrapper[4563]: I1124 09:56:39.383108 4563 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/544409da-ab29-4a2c-bf2b-6f2404c64422-host\") on node \"crc\" DevicePath \"\"" Nov 24 09:56:39 crc kubenswrapper[4563]: I1124 09:56:39.388173 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/544409da-ab29-4a2c-bf2b-6f2404c64422-kube-api-access-jtgq8" (OuterVolumeSpecName: "kube-api-access-jtgq8") pod "544409da-ab29-4a2c-bf2b-6f2404c64422" (UID: "544409da-ab29-4a2c-bf2b-6f2404c64422"). InnerVolumeSpecName "kube-api-access-jtgq8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:56:39 crc kubenswrapper[4563]: I1124 09:56:39.484934 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtgq8\" (UniqueName: \"kubernetes.io/projected/544409da-ab29-4a2c-bf2b-6f2404c64422-kube-api-access-jtgq8\") on node \"crc\" DevicePath \"\"" Nov 24 09:56:39 crc kubenswrapper[4563]: I1124 09:56:39.706207 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h7gkj/crc-debug-vwb2x"] Nov 24 09:56:39 crc kubenswrapper[4563]: E1124 09:56:39.706564 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="544409da-ab29-4a2c-bf2b-6f2404c64422" containerName="container-00" Nov 24 09:56:39 crc kubenswrapper[4563]: I1124 09:56:39.706577 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="544409da-ab29-4a2c-bf2b-6f2404c64422" containerName="container-00" Nov 24 09:56:39 crc kubenswrapper[4563]: I1124 09:56:39.706778 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="544409da-ab29-4a2c-bf2b-6f2404c64422" containerName="container-00" Nov 24 09:56:39 crc kubenswrapper[4563]: I1124 09:56:39.707338 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h7gkj/crc-debug-vwb2x" Nov 24 09:56:39 crc kubenswrapper[4563]: I1124 09:56:39.790151 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a7e2be0-612f-4055-a30d-843de4b14442-host\") pod \"crc-debug-vwb2x\" (UID: \"9a7e2be0-612f-4055-a30d-843de4b14442\") " pod="openshift-must-gather-h7gkj/crc-debug-vwb2x" Nov 24 09:56:39 crc kubenswrapper[4563]: I1124 09:56:39.790350 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwtwd\" (UniqueName: \"kubernetes.io/projected/9a7e2be0-612f-4055-a30d-843de4b14442-kube-api-access-bwtwd\") pod \"crc-debug-vwb2x\" (UID: \"9a7e2be0-612f-4055-a30d-843de4b14442\") " pod="openshift-must-gather-h7gkj/crc-debug-vwb2x" Nov 24 09:56:39 crc kubenswrapper[4563]: I1124 09:56:39.892672 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a7e2be0-612f-4055-a30d-843de4b14442-host\") pod \"crc-debug-vwb2x\" (UID: \"9a7e2be0-612f-4055-a30d-843de4b14442\") " pod="openshift-must-gather-h7gkj/crc-debug-vwb2x" Nov 24 09:56:39 crc kubenswrapper[4563]: I1124 09:56:39.892846 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwtwd\" (UniqueName: \"kubernetes.io/projected/9a7e2be0-612f-4055-a30d-843de4b14442-kube-api-access-bwtwd\") pod \"crc-debug-vwb2x\" (UID: \"9a7e2be0-612f-4055-a30d-843de4b14442\") " pod="openshift-must-gather-h7gkj/crc-debug-vwb2x" Nov 24 09:56:39 crc kubenswrapper[4563]: I1124 09:56:39.892996 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a7e2be0-612f-4055-a30d-843de4b14442-host\") pod \"crc-debug-vwb2x\" (UID: \"9a7e2be0-612f-4055-a30d-843de4b14442\") " pod="openshift-must-gather-h7gkj/crc-debug-vwb2x" Nov 24 09:56:39 crc 
kubenswrapper[4563]: I1124 09:56:39.907142 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwtwd\" (UniqueName: \"kubernetes.io/projected/9a7e2be0-612f-4055-a30d-843de4b14442-kube-api-access-bwtwd\") pod \"crc-debug-vwb2x\" (UID: \"9a7e2be0-612f-4055-a30d-843de4b14442\") " pod="openshift-must-gather-h7gkj/crc-debug-vwb2x" Nov 24 09:56:40 crc kubenswrapper[4563]: I1124 09:56:40.019662 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h7gkj/crc-debug-vwb2x" Nov 24 09:56:40 crc kubenswrapper[4563]: W1124 09:56:40.039761 4563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a7e2be0_612f_4055_a30d_843de4b14442.slice/crio-52e488622dda9f5c8c7d415b5ab3703dbaa21ff0f7d255be3c90b38010c22cf8 WatchSource:0}: Error finding container 52e488622dda9f5c8c7d415b5ab3703dbaa21ff0f7d255be3c90b38010c22cf8: Status 404 returned error can't find the container with id 52e488622dda9f5c8c7d415b5ab3703dbaa21ff0f7d255be3c90b38010c22cf8 Nov 24 09:56:40 crc kubenswrapper[4563]: I1124 09:56:40.160772 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h7gkj/crc-debug-8mkqb" Nov 24 09:56:40 crc kubenswrapper[4563]: I1124 09:56:40.160777 4563 scope.go:117] "RemoveContainer" containerID="2dae0a1407577fae9291deaa4016d8e6a1e5725f568a28eb7c64cbccb7aec9b5" Nov 24 09:56:40 crc kubenswrapper[4563]: I1124 09:56:40.162622 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h7gkj/crc-debug-vwb2x" event={"ID":"9a7e2be0-612f-4055-a30d-843de4b14442","Type":"ContainerStarted","Data":"71e24805e4b96d75ddc1ffac67525dd0fb8a26cad079716ba714d4bc09504bae"} Nov 24 09:56:40 crc kubenswrapper[4563]: I1124 09:56:40.162681 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h7gkj/crc-debug-vwb2x" event={"ID":"9a7e2be0-612f-4055-a30d-843de4b14442","Type":"ContainerStarted","Data":"52e488622dda9f5c8c7d415b5ab3703dbaa21ff0f7d255be3c90b38010c22cf8"} Nov 24 09:56:40 crc kubenswrapper[4563]: I1124 09:56:40.179603 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-h7gkj/crc-debug-vwb2x" podStartSLOduration=1.179588283 podStartE2EDuration="1.179588283s" podCreationTimestamp="2025-11-24 09:56:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 09:56:40.173978287 +0000 UTC m=+3177.432955734" watchObservedRunningTime="2025-11-24 09:56:40.179588283 +0000 UTC m=+3177.438565730" Nov 24 09:56:41 crc kubenswrapper[4563]: I1124 09:56:41.081311 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="544409da-ab29-4a2c-bf2b-6f2404c64422" path="/var/lib/kubelet/pods/544409da-ab29-4a2c-bf2b-6f2404c64422/volumes" Nov 24 09:56:41 crc kubenswrapper[4563]: I1124 09:56:41.183266 4563 generic.go:334] "Generic (PLEG): container finished" podID="9a7e2be0-612f-4055-a30d-843de4b14442" containerID="71e24805e4b96d75ddc1ffac67525dd0fb8a26cad079716ba714d4bc09504bae" exitCode=0 Nov 24 09:56:41 crc 
kubenswrapper[4563]: I1124 09:56:41.183340 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h7gkj/crc-debug-vwb2x" event={"ID":"9a7e2be0-612f-4055-a30d-843de4b14442","Type":"ContainerDied","Data":"71e24805e4b96d75ddc1ffac67525dd0fb8a26cad079716ba714d4bc09504bae"} Nov 24 09:56:42 crc kubenswrapper[4563]: I1124 09:56:42.260100 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h7gkj/crc-debug-vwb2x" Nov 24 09:56:42 crc kubenswrapper[4563]: I1124 09:56:42.295378 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-h7gkj/crc-debug-vwb2x"] Nov 24 09:56:42 crc kubenswrapper[4563]: I1124 09:56:42.300541 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-h7gkj/crc-debug-vwb2x"] Nov 24 09:56:42 crc kubenswrapper[4563]: I1124 09:56:42.443466 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwtwd\" (UniqueName: \"kubernetes.io/projected/9a7e2be0-612f-4055-a30d-843de4b14442-kube-api-access-bwtwd\") pod \"9a7e2be0-612f-4055-a30d-843de4b14442\" (UID: \"9a7e2be0-612f-4055-a30d-843de4b14442\") " Nov 24 09:56:42 crc kubenswrapper[4563]: I1124 09:56:42.443869 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a7e2be0-612f-4055-a30d-843de4b14442-host\") pod \"9a7e2be0-612f-4055-a30d-843de4b14442\" (UID: \"9a7e2be0-612f-4055-a30d-843de4b14442\") " Nov 24 09:56:42 crc kubenswrapper[4563]: I1124 09:56:42.443962 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a7e2be0-612f-4055-a30d-843de4b14442-host" (OuterVolumeSpecName: "host") pod "9a7e2be0-612f-4055-a30d-843de4b14442" (UID: "9a7e2be0-612f-4055-a30d-843de4b14442"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 24 09:56:42 crc kubenswrapper[4563]: I1124 09:56:42.444531 4563 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a7e2be0-612f-4055-a30d-843de4b14442-host\") on node \"crc\" DevicePath \"\"" Nov 24 09:56:42 crc kubenswrapper[4563]: I1124 09:56:42.449889 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a7e2be0-612f-4055-a30d-843de4b14442-kube-api-access-bwtwd" (OuterVolumeSpecName: "kube-api-access-bwtwd") pod "9a7e2be0-612f-4055-a30d-843de4b14442" (UID: "9a7e2be0-612f-4055-a30d-843de4b14442"). InnerVolumeSpecName "kube-api-access-bwtwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:56:42 crc kubenswrapper[4563]: I1124 09:56:42.546216 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwtwd\" (UniqueName: \"kubernetes.io/projected/9a7e2be0-612f-4055-a30d-843de4b14442-kube-api-access-bwtwd\") on node \"crc\" DevicePath \"\"" Nov 24 09:56:43 crc kubenswrapper[4563]: I1124 09:56:43.067481 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a7e2be0-612f-4055-a30d-843de4b14442" path="/var/lib/kubelet/pods/9a7e2be0-612f-4055-a30d-843de4b14442/volumes" Nov 24 09:56:43 crc kubenswrapper[4563]: I1124 09:56:43.198060 4563 scope.go:117] "RemoveContainer" containerID="71e24805e4b96d75ddc1ffac67525dd0fb8a26cad079716ba714d4bc09504bae" Nov 24 09:56:43 crc kubenswrapper[4563]: I1124 09:56:43.198623 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h7gkj/crc-debug-vwb2x" Nov 24 09:56:47 crc kubenswrapper[4563]: I1124 09:56:47.055659 4563 scope.go:117] "RemoveContainer" containerID="b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108" Nov 24 09:56:47 crc kubenswrapper[4563]: E1124 09:56:47.056400 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:56:57 crc kubenswrapper[4563]: I1124 09:56:57.400130 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-b876f5fb4-sx5lp_29588100-1198-4e82-a1c3-87d27b71aa65/barbican-api/0.log" Nov 24 09:56:57 crc kubenswrapper[4563]: I1124 09:56:57.474713 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-b876f5fb4-sx5lp_29588100-1198-4e82-a1c3-87d27b71aa65/barbican-api-log/0.log" Nov 24 09:56:57 crc kubenswrapper[4563]: I1124 09:56:57.546598 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64cbb46c46-7fsmj_17d8ec67-c825-4ab0-bd77-cd610ff6838e/barbican-keystone-listener/0.log" Nov 24 09:56:57 crc kubenswrapper[4563]: I1124 09:56:57.591027 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-64cbb46c46-7fsmj_17d8ec67-c825-4ab0-bd77-cd610ff6838e/barbican-keystone-listener-log/0.log" Nov 24 09:56:57 crc kubenswrapper[4563]: I1124 09:56:57.684097 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6d65456589-92q4d_9d88e05b-2750-483f-a0a3-5169e4cc919c/barbican-worker/0.log" Nov 24 09:56:57 crc kubenswrapper[4563]: I1124 09:56:57.726105 
4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6d65456589-92q4d_9d88e05b-2750-483f-a0a3-5169e4cc919c/barbican-worker-log/0.log" Nov 24 09:56:57 crc kubenswrapper[4563]: I1124 09:56:57.842190 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-xt5t2_1c07ab91-ccc4-46d0-b15c-0d20675fc19a/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:56:57 crc kubenswrapper[4563]: I1124 09:56:57.900626 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_86acf291-2839-49f7-aaf3-33ba6e0cae2e/ceilometer-central-agent/0.log" Nov 24 09:56:57 crc kubenswrapper[4563]: I1124 09:56:57.991488 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_86acf291-2839-49f7-aaf3-33ba6e0cae2e/ceilometer-notification-agent/0.log" Nov 24 09:56:57 crc kubenswrapper[4563]: I1124 09:56:57.999351 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_86acf291-2839-49f7-aaf3-33ba6e0cae2e/proxy-httpd/0.log" Nov 24 09:56:58 crc kubenswrapper[4563]: I1124 09:56:58.008436 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_86acf291-2839-49f7-aaf3-33ba6e0cae2e/sg-core/0.log" Nov 24 09:56:58 crc kubenswrapper[4563]: I1124 09:56:58.152289 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_dd5367ce-55f6-4685-b414-4ef54ce7df7a/cinder-api-log/0.log" Nov 24 09:56:58 crc kubenswrapper[4563]: I1124 09:56:58.186543 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_dd5367ce-55f6-4685-b414-4ef54ce7df7a/cinder-api/0.log" Nov 24 09:56:58 crc kubenswrapper[4563]: I1124 09:56:58.258300 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3/cinder-scheduler/0.log" Nov 24 09:56:58 crc kubenswrapper[4563]: I1124 09:56:58.328085 4563 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_0fc67a4e-40c9-4e4f-87fb-8cc10e65fda3/probe/0.log" Nov 24 09:56:58 crc kubenswrapper[4563]: I1124 09:56:58.376197 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-6mqmj_ea6d045d-1394-436f-9329-9f3a9d10610b/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:56:58 crc kubenswrapper[4563]: I1124 09:56:58.487778 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-64qvz_dd51baae-5c71-4421-9cc1-1095c3bba2e9/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:56:58 crc kubenswrapper[4563]: I1124 09:56:58.562236 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-64858ddbd7-mtmng_34b79993-dd96-4594-a00f-3ca0dd207e62/init/0.log" Nov 24 09:56:58 crc kubenswrapper[4563]: I1124 09:56:58.697248 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-64858ddbd7-mtmng_34b79993-dd96-4594-a00f-3ca0dd207e62/init/0.log" Nov 24 09:56:58 crc kubenswrapper[4563]: I1124 09:56:58.722598 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-64858ddbd7-mtmng_34b79993-dd96-4594-a00f-3ca0dd207e62/dnsmasq-dns/0.log" Nov 24 09:56:58 crc kubenswrapper[4563]: I1124 09:56:58.744034 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-srkpz_75442289-63cd-4b6c-b86d-70ab08ae8dc2/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:56:58 crc kubenswrapper[4563]: I1124 09:56:58.885466 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f6624aa1-6acc-43a1-944e-20a77c1b09d9/glance-log/0.log" Nov 24 09:56:58 crc kubenswrapper[4563]: I1124 09:56:58.893878 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_f6624aa1-6acc-43a1-944e-20a77c1b09d9/glance-httpd/0.log" Nov 24 09:56:59 crc kubenswrapper[4563]: I1124 09:56:59.009217 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b57487f8-d1f8-4f97-b92e-7385ecc88074/glance-httpd/0.log" Nov 24 09:56:59 crc kubenswrapper[4563]: I1124 09:56:59.030944 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b57487f8-d1f8-4f97-b92e-7385ecc88074/glance-log/0.log" Nov 24 09:56:59 crc kubenswrapper[4563]: I1124 09:56:59.170278 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6d999bbd6-cqj6s_a7688cb4-70ea-43e4-85f2-6b96f972538f/horizon/0.log" Nov 24 09:56:59 crc kubenswrapper[4563]: I1124 09:56:59.217553 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-p4wzr_aa741fe2-400c-479c-bfb3-0d5273b064e2/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:56:59 crc kubenswrapper[4563]: I1124 09:56:59.391260 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6d999bbd6-cqj6s_a7688cb4-70ea-43e4-85f2-6b96f972538f/horizon-log/0.log" Nov 24 09:56:59 crc kubenswrapper[4563]: I1124 09:56:59.423880 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-4bvkr_424c4e83-a3c6-4eea-958e-e0cf83f20fdf/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:56:59 crc kubenswrapper[4563]: I1124 09:56:59.561323 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_86c40cc3-1c2a-47db-9ed2-eb746b65ac4b/kube-state-metrics/0.log" Nov 24 09:56:59 crc kubenswrapper[4563]: I1124 09:56:59.638551 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-6565cb8596-rhwtd_d2cbaf0e-2aaa-445b-9d5f-82b7f03e51c1/keystone-api/0.log" Nov 24 09:56:59 crc kubenswrapper[4563]: I1124 09:56:59.729476 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-z8q62_96a07419-7337-47f5-89aa-233e06eec048/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:56:59 crc kubenswrapper[4563]: I1124 09:56:59.999687 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-85d84cd957-f2sp9_5c5b560e-1f0c-4469-8455-1aec5e7653bd/neutron-httpd/0.log" Nov 24 09:57:00 crc kubenswrapper[4563]: I1124 09:57:00.039932 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-85d84cd957-f2sp9_5c5b560e-1f0c-4469-8455-1aec5e7653bd/neutron-api/0.log" Nov 24 09:57:00 crc kubenswrapper[4563]: I1124 09:57:00.131329 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-qlbv2_5c13fe46-9855-4291-b685-df5de9abafa7/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:57:00 crc kubenswrapper[4563]: I1124 09:57:00.506734 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6/nova-api-log/0.log" Nov 24 09:57:00 crc kubenswrapper[4563]: I1124 09:57:00.581403 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a7e5bec0-7f94-410f-9344-aaa699457924/nova-cell0-conductor-conductor/0.log" Nov 24 09:57:00 crc kubenswrapper[4563]: I1124 09:57:00.710722 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3da51b3b-6286-4cb5-bcc7-d715eb2fe2a6/nova-api-api/0.log" Nov 24 09:57:00 crc kubenswrapper[4563]: I1124 09:57:00.762596 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_89f4e9e2-2f29-4076-a9e3-8513bfd1e07e/nova-cell1-conductor-conductor/0.log" 
Nov 24 09:57:00 crc kubenswrapper[4563]: I1124 09:57:00.872980 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_15b68912-1886-4162-88c8-02a37d34c54a/nova-cell1-novncproxy-novncproxy/0.log" Nov 24 09:57:00 crc kubenswrapper[4563]: I1124 09:57:00.911166 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-nx4nm_49ccd723-5c1a-4763-9eb4-5aed7651bad5/nova-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:57:01 crc kubenswrapper[4563]: I1124 09:57:01.118364 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_c17fb818-0e53-4655-89ac-a1bb9022b5f8/nova-metadata-log/0.log" Nov 24 09:57:01 crc kubenswrapper[4563]: I1124 09:57:01.308476 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_38360dce-8f0e-42b1-ba4c-d13036b2794a/nova-scheduler-scheduler/0.log" Nov 24 09:57:01 crc kubenswrapper[4563]: I1124 09:57:01.321038 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2c2b6368-21fd-4c13-b008-5fe4be95dc8d/mysql-bootstrap/0.log" Nov 24 09:57:01 crc kubenswrapper[4563]: I1124 09:57:01.473610 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2c2b6368-21fd-4c13-b008-5fe4be95dc8d/mysql-bootstrap/0.log" Nov 24 09:57:01 crc kubenswrapper[4563]: I1124 09:57:01.508422 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2c2b6368-21fd-4c13-b008-5fe4be95dc8d/galera/0.log" Nov 24 09:57:01 crc kubenswrapper[4563]: I1124 09:57:01.620333 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b0de325e-9aea-4ee2-9cc4-093f3d8d3f65/mysql-bootstrap/0.log" Nov 24 09:57:01 crc kubenswrapper[4563]: I1124 09:57:01.831269 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_b0de325e-9aea-4ee2-9cc4-093f3d8d3f65/galera/0.log" Nov 24 09:57:01 crc kubenswrapper[4563]: I1124 09:57:01.840440 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b0de325e-9aea-4ee2-9cc4-093f3d8d3f65/mysql-bootstrap/0.log" Nov 24 09:57:01 crc kubenswrapper[4563]: I1124 09:57:01.957011 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_c17fb818-0e53-4655-89ac-a1bb9022b5f8/nova-metadata-metadata/0.log" Nov 24 09:57:02 crc kubenswrapper[4563]: I1124 09:57:02.008912 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_9b1ff524-d9dc-4433-a21c-f6d00e3b89d4/openstackclient/0.log" Nov 24 09:57:02 crc kubenswrapper[4563]: I1124 09:57:02.009209 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-cblxg_d5d57856-0858-4ef6-86b1-282d4bc462be/openstack-network-exporter/0.log" Nov 24 09:57:02 crc kubenswrapper[4563]: I1124 09:57:02.055106 4563 scope.go:117] "RemoveContainer" containerID="b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108" Nov 24 09:57:02 crc kubenswrapper[4563]: E1124 09:57:02.056227 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:57:02 crc kubenswrapper[4563]: I1124 09:57:02.153365 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6z24f_01d7f46a-ff30-4904-a63a-8d41cea54dd7/ovsdb-server-init/0.log" Nov 24 09:57:02 crc kubenswrapper[4563]: I1124 09:57:02.338119 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-6z24f_01d7f46a-ff30-4904-a63a-8d41cea54dd7/ovsdb-server/0.log" Nov 24 09:57:02 crc kubenswrapper[4563]: I1124 09:57:02.338862 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6z24f_01d7f46a-ff30-4904-a63a-8d41cea54dd7/ovsdb-server-init/0.log" Nov 24 09:57:02 crc kubenswrapper[4563]: I1124 09:57:02.356863 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6z24f_01d7f46a-ff30-4904-a63a-8d41cea54dd7/ovs-vswitchd/0.log" Nov 24 09:57:02 crc kubenswrapper[4563]: I1124 09:57:02.491309 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-qtfnl_241e854a-eb29-4933-98be-bad6b9295260/ovn-controller/0.log" Nov 24 09:57:02 crc kubenswrapper[4563]: I1124 09:57:02.579673 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-tt4mm_5ab3a15a-af4c-43a3-9d3a-1515e2c8228b/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:57:02 crc kubenswrapper[4563]: I1124 09:57:02.656705 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c78e2b4d-f2bf-435e-b163-c9415021f43c/openstack-network-exporter/0.log" Nov 24 09:57:02 crc kubenswrapper[4563]: I1124 09:57:02.745897 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c78e2b4d-f2bf-435e-b163-c9415021f43c/ovn-northd/0.log" Nov 24 09:57:02 crc kubenswrapper[4563]: I1124 09:57:02.834740 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3399d213-46c4-42c1-9d69-26246c4ed771/ovsdbserver-nb/0.log" Nov 24 09:57:02 crc kubenswrapper[4563]: I1124 09:57:02.856695 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3399d213-46c4-42c1-9d69-26246c4ed771/openstack-network-exporter/0.log" Nov 24 09:57:02 crc kubenswrapper[4563]: I1124 09:57:02.980878 4563 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_04022879-4b41-4e57-ae94-a3517d382e7d/ovsdbserver-sb/0.log" Nov 24 09:57:03 crc kubenswrapper[4563]: I1124 09:57:03.073966 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_04022879-4b41-4e57-ae94-a3517d382e7d/openstack-network-exporter/0.log" Nov 24 09:57:03 crc kubenswrapper[4563]: I1124 09:57:03.263134 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-748c4bdffd-w974j_9103bb32-e426-4c4b-ade8-d3430cf5ca11/placement-api/0.log" Nov 24 09:57:03 crc kubenswrapper[4563]: I1124 09:57:03.290489 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-748c4bdffd-w974j_9103bb32-e426-4c4b-ade8-d3430cf5ca11/placement-log/0.log" Nov 24 09:57:03 crc kubenswrapper[4563]: I1124 09:57:03.306413 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_62faa658-2c71-4afe-9fc2-4d9fd0079928/setup-container/0.log" Nov 24 09:57:03 crc kubenswrapper[4563]: I1124 09:57:03.472875 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_62faa658-2c71-4afe-9fc2-4d9fd0079928/setup-container/0.log" Nov 24 09:57:03 crc kubenswrapper[4563]: I1124 09:57:03.495793 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_62faa658-2c71-4afe-9fc2-4d9fd0079928/rabbitmq/0.log" Nov 24 09:57:03 crc kubenswrapper[4563]: I1124 09:57:03.584326 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_45003ba2-beec-43e7-9248-42c517ed3bf7/setup-container/0.log" Nov 24 09:57:03 crc kubenswrapper[4563]: I1124 09:57:03.742341 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_45003ba2-beec-43e7-9248-42c517ed3bf7/setup-container/0.log" Nov 24 09:57:03 crc kubenswrapper[4563]: I1124 09:57:03.757971 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_45003ba2-beec-43e7-9248-42c517ed3bf7/rabbitmq/0.log" Nov 24 09:57:03 crc kubenswrapper[4563]: I1124 09:57:03.818325 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-m7shw_bfd3077a-ae15-4aa6-8a14-6edb2c4f25d0/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:57:04 crc kubenswrapper[4563]: I1124 09:57:04.004964 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-tmkfw_96dc1f15-b31a-4eb6-91e7-35b341f1347a/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:57:04 crc kubenswrapper[4563]: I1124 09:57:04.044460 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-ps7bx_c3cdb156-f67f-4dd2-b04d-9fb263802321/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:57:04 crc kubenswrapper[4563]: I1124 09:57:04.206236 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-mbq9m_5f2b4785-aae5-4031-9e66-c3601ef67b6a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:57:04 crc kubenswrapper[4563]: I1124 09:57:04.263718 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-4lklp_77470e38-d989-4832-8cac-4b2f1a8f2d14/ssh-known-hosts-edpm-deployment/0.log" Nov 24 09:57:04 crc kubenswrapper[4563]: I1124 09:57:04.408013 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5487cdb76f-rn9rx_64984138-1ff3-4d53-b4b9-e301fc5f2f80/proxy-server/0.log" Nov 24 09:57:04 crc kubenswrapper[4563]: I1124 09:57:04.515360 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5487cdb76f-rn9rx_64984138-1ff3-4d53-b4b9-e301fc5f2f80/proxy-httpd/0.log" Nov 24 09:57:04 crc kubenswrapper[4563]: I1124 09:57:04.551747 4563 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_swift-ring-rebalance-mgbkw_874b4a65-f3cc-4bb7-9634-0a464700f823/swift-ring-rebalance/0.log" Nov 24 09:57:04 crc kubenswrapper[4563]: I1124 09:57:04.631898 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/account-auditor/0.log" Nov 24 09:57:04 crc kubenswrapper[4563]: I1124 09:57:04.700162 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/account-reaper/0.log" Nov 24 09:57:04 crc kubenswrapper[4563]: I1124 09:57:04.756029 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/account-replicator/0.log" Nov 24 09:57:04 crc kubenswrapper[4563]: I1124 09:57:04.771843 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/account-server/0.log" Nov 24 09:57:04 crc kubenswrapper[4563]: I1124 09:57:04.792037 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/container-auditor/0.log" Nov 24 09:57:04 crc kubenswrapper[4563]: I1124 09:57:04.893131 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/container-replicator/0.log" Nov 24 09:57:04 crc kubenswrapper[4563]: I1124 09:57:04.958048 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/container-server/0.log" Nov 24 09:57:04 crc kubenswrapper[4563]: I1124 09:57:04.963561 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/container-updater/0.log" Nov 24 09:57:05 crc kubenswrapper[4563]: I1124 09:57:05.000021 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/object-auditor/0.log" Nov 24 09:57:05 crc kubenswrapper[4563]: I1124 09:57:05.058472 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/object-expirer/0.log" Nov 24 09:57:05 crc kubenswrapper[4563]: I1124 09:57:05.150571 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/object-server/0.log" Nov 24 09:57:05 crc kubenswrapper[4563]: I1124 09:57:05.163233 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/object-replicator/0.log" Nov 24 09:57:05 crc kubenswrapper[4563]: I1124 09:57:05.169481 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/object-updater/0.log" Nov 24 09:57:05 crc kubenswrapper[4563]: I1124 09:57:05.222563 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/rsync/0.log" Nov 24 09:57:05 crc kubenswrapper[4563]: I1124 09:57:05.309201 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9d44fee0-139c-42c9-8ad1-3991121f1d67/swift-recon-cron/0.log" Nov 24 09:57:05 crc kubenswrapper[4563]: I1124 09:57:05.370740 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-wbb7z_0bcd5a3c-58dc-4ee9-b77d-a19d7dc36aca/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:57:05 crc kubenswrapper[4563]: I1124 09:57:05.509312 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_d15e06ff-83ac-44e9-aebe-9756628722e6/tempest-tests-tempest-tests-runner/0.log" Nov 24 09:57:05 crc kubenswrapper[4563]: I1124 09:57:05.558326 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_b8a9086a-b375-485e-990a-27b6e4832c77/test-operator-logs-container/0.log" Nov 24 09:57:05 crc kubenswrapper[4563]: I1124 09:57:05.681140 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-6j7wm_72560768-c189-4eaa-9128-486ec369275b/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 24 09:57:14 crc kubenswrapper[4563]: I1124 09:57:14.054232 4563 scope.go:117] "RemoveContainer" containerID="b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108" Nov 24 09:57:14 crc kubenswrapper[4563]: E1124 09:57:14.054907 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:57:16 crc kubenswrapper[4563]: I1124 09:57:16.496441 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_bdeec6b1-05d8-4275-839f-a02e22e26f61/memcached/0.log" Nov 24 09:57:25 crc kubenswrapper[4563]: I1124 09:57:25.386540 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d_8d314d9f-2d34-4e4c-899e-a113c55ad0df/util/0.log" Nov 24 09:57:25 crc kubenswrapper[4563]: I1124 09:57:25.542213 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d_8d314d9f-2d34-4e4c-899e-a113c55ad0df/pull/0.log" Nov 24 09:57:25 crc kubenswrapper[4563]: I1124 09:57:25.568949 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d_8d314d9f-2d34-4e4c-899e-a113c55ad0df/util/0.log" Nov 24 09:57:25 crc kubenswrapper[4563]: I1124 09:57:25.599539 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d_8d314d9f-2d34-4e4c-899e-a113c55ad0df/pull/0.log" Nov 24 09:57:25 crc kubenswrapper[4563]: I1124 09:57:25.747376 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d_8d314d9f-2d34-4e4c-899e-a113c55ad0df/extract/0.log" Nov 24 09:57:25 crc kubenswrapper[4563]: I1124 09:57:25.765047 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d_8d314d9f-2d34-4e4c-899e-a113c55ad0df/pull/0.log" Nov 24 09:57:25 crc kubenswrapper[4563]: I1124 09:57:25.766206 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1ca9b138781dcf125934bc878376abf75f877c2252ee8cf8f3500b7287lkf9d_8d314d9f-2d34-4e4c-899e-a113c55ad0df/util/0.log" Nov 24 09:57:25 crc kubenswrapper[4563]: I1124 09:57:25.882286 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7768f8c84f-glf4s_f81c148e-bf8e-4b57-895e-f2c11411cf7a/kube-rbac-proxy/0.log" Nov 24 09:57:25 crc kubenswrapper[4563]: I1124 09:57:25.980459 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7768f8c84f-glf4s_f81c148e-bf8e-4b57-895e-f2c11411cf7a/manager/0.log" Nov 24 09:57:25 crc kubenswrapper[4563]: I1124 09:57:25.995000 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6d8fd67bf7-jnx9f_a62a6523-e592-437f-b3ba-320e24f619dc/kube-rbac-proxy/0.log" Nov 24 09:57:26 crc 
kubenswrapper[4563]: I1124 09:57:26.055491 4563 scope.go:117] "RemoveContainer" containerID="b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108" Nov 24 09:57:26 crc kubenswrapper[4563]: E1124 09:57:26.055900 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:57:26 crc kubenswrapper[4563]: I1124 09:57:26.124651 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6d8fd67bf7-jnx9f_a62a6523-e592-437f-b3ba-320e24f619dc/manager/0.log" Nov 24 09:57:26 crc kubenswrapper[4563]: I1124 09:57:26.162203 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-56dfb6b67f-77wgb_17904228-d0e5-489c-a965-5cba44f3b3f2/manager/0.log" Nov 24 09:57:26 crc kubenswrapper[4563]: I1124 09:57:26.180604 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-56dfb6b67f-77wgb_17904228-d0e5-489c-a965-5cba44f3b3f2/kube-rbac-proxy/0.log" Nov 24 09:57:26 crc kubenswrapper[4563]: I1124 09:57:26.290849 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8667fbf6f6-k9wzp_77d539d7-5235-4576-a276-8247c5824020/kube-rbac-proxy/0.log" Nov 24 09:57:26 crc kubenswrapper[4563]: I1124 09:57:26.368897 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8667fbf6f6-k9wzp_77d539d7-5235-4576-a276-8247c5824020/manager/0.log" Nov 24 09:57:26 crc kubenswrapper[4563]: I1124 09:57:26.459816 4563 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-bf4c6585d-tnxst_b4f4311c-5634-4bae-8659-5efa662f0562/kube-rbac-proxy/0.log" Nov 24 09:57:26 crc kubenswrapper[4563]: I1124 09:57:26.509955 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-bf4c6585d-tnxst_b4f4311c-5634-4bae-8659-5efa662f0562/manager/0.log" Nov 24 09:57:26 crc kubenswrapper[4563]: I1124 09:57:26.535546 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5d86b44686-4x76m_70a63634-9a9f-46b3-af05-9dc02c0a03e1/kube-rbac-proxy/0.log" Nov 24 09:57:26 crc kubenswrapper[4563]: I1124 09:57:26.640258 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5d86b44686-4x76m_70a63634-9a9f-46b3-af05-9dc02c0a03e1/manager/0.log" Nov 24 09:57:26 crc kubenswrapper[4563]: I1124 09:57:26.685671 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-769d9c7585-4f5hq_68eeb4a0-b192-4e6a-b02b-f34415b29316/kube-rbac-proxy/0.log" Nov 24 09:57:26 crc kubenswrapper[4563]: I1124 09:57:26.838899 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-769d9c7585-4f5hq_68eeb4a0-b192-4e6a-b02b-f34415b29316/manager/0.log" Nov 24 09:57:26 crc kubenswrapper[4563]: I1124 09:57:26.857564 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5c75d7c94b-ltqbl_26aa13a3-737a-457f-9d46-29018cfccd1e/kube-rbac-proxy/0.log" Nov 24 09:57:26 crc kubenswrapper[4563]: I1124 09:57:26.860177 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5c75d7c94b-ltqbl_26aa13a3-737a-457f-9d46-29018cfccd1e/manager/0.log" Nov 24 09:57:27 crc 
kubenswrapper[4563]: I1124 09:57:27.018561 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7879fb76fd-4tv9l_ebed0d67-0bac-4d1f-a2d0-2e367d78d157/kube-rbac-proxy/0.log" Nov 24 09:57:27 crc kubenswrapper[4563]: I1124 09:57:27.059431 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7879fb76fd-4tv9l_ebed0d67-0bac-4d1f-a2d0-2e367d78d157/manager/0.log" Nov 24 09:57:27 crc kubenswrapper[4563]: I1124 09:57:27.176437 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7bb88cb858-44jfn_13a3e7f4-4c3d-46e7-a7ee-612f9ac17bc7/kube-rbac-proxy/0.log" Nov 24 09:57:27 crc kubenswrapper[4563]: I1124 09:57:27.203205 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7bb88cb858-44jfn_13a3e7f4-4c3d-46e7-a7ee-612f9ac17bc7/manager/0.log" Nov 24 09:57:27 crc kubenswrapper[4563]: I1124 09:57:27.269951 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6f8c5b86cb-94tjk_a30aea9a-f4c8-42a3-89bb-af9ffef55544/kube-rbac-proxy/0.log" Nov 24 09:57:27 crc kubenswrapper[4563]: I1124 09:57:27.352105 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6f8c5b86cb-94tjk_a30aea9a-f4c8-42a3-89bb-af9ffef55544/manager/0.log" Nov 24 09:57:27 crc kubenswrapper[4563]: I1124 09:57:27.432414 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-66b7d6f598-fffcm_ffcb9e74-1697-402a-b77b-5a3ecc832759/kube-rbac-proxy/0.log" Nov 24 09:57:27 crc kubenswrapper[4563]: I1124 09:57:27.487190 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-66b7d6f598-fffcm_ffcb9e74-1697-402a-b77b-5a3ecc832759/manager/0.log" Nov 24 09:57:27 crc kubenswrapper[4563]: I1124 09:57:27.564320 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-86d796d84d-vkltr_c089c738-65b8-46e2-91c9-59b962081c05/kube-rbac-proxy/0.log" Nov 24 09:57:27 crc kubenswrapper[4563]: I1124 09:57:27.678795 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-86d796d84d-vkltr_c089c738-65b8-46e2-91c9-59b962081c05/manager/0.log" Nov 24 09:57:27 crc kubenswrapper[4563]: I1124 09:57:27.740968 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6fdc856c5d-h78s9_71d78263-9c76-454f-8b9f-1392c9fcfc2f/manager/0.log" Nov 24 09:57:27 crc kubenswrapper[4563]: I1124 09:57:27.749674 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6fdc856c5d-h78s9_71d78263-9c76-454f-8b9f-1392c9fcfc2f/kube-rbac-proxy/0.log" Nov 24 09:57:27 crc kubenswrapper[4563]: I1124 09:57:27.854662 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-79d88dcd444qmtr_974a1619-7c48-46d6-b639-5f965c6b747a/kube-rbac-proxy/0.log" Nov 24 09:57:27 crc kubenswrapper[4563]: I1124 09:57:27.919394 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-79d88dcd444qmtr_974a1619-7c48-46d6-b639-5f965c6b747a/manager/0.log" Nov 24 09:57:28 crc kubenswrapper[4563]: I1124 09:57:28.061039 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6cb9dc54f8-m7w2q_cdec1b8b-630a-452a-b4d9-3cd42ef204c7/kube-rbac-proxy/0.log" Nov 24 09:57:28 crc kubenswrapper[4563]: 
I1124 09:57:28.126383 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-8486c7f98b-xz8g7_56c65669-5fad-40b3-aec8-b459c3e6b0f8/kube-rbac-proxy/0.log" Nov 24 09:57:28 crc kubenswrapper[4563]: I1124 09:57:28.397984 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-8486c7f98b-xz8g7_56c65669-5fad-40b3-aec8-b459c3e6b0f8/operator/0.log" Nov 24 09:57:28 crc kubenswrapper[4563]: I1124 09:57:28.461132 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-5f5wm_0079c598-0bc4-4809-9813-0aa163a961a1/registry-server/0.log" Nov 24 09:57:28 crc kubenswrapper[4563]: I1124 09:57:28.558551 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5bdf4f7f7f-6n5jh_9fb1ddc7-1195-412e-93ed-4799bc756bae/kube-rbac-proxy/0.log" Nov 24 09:57:28 crc kubenswrapper[4563]: I1124 09:57:28.625844 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5bdf4f7f7f-6n5jh_9fb1ddc7-1195-412e-93ed-4799bc756bae/manager/0.log" Nov 24 09:57:28 crc kubenswrapper[4563]: I1124 09:57:28.736384 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-6dc664666c-6flr8_6a018387-ddf9-40f3-a421-d1a760581c8f/manager/0.log" Nov 24 09:57:28 crc kubenswrapper[4563]: I1124 09:57:28.750570 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-6dc664666c-6flr8_6a018387-ddf9-40f3-a421-d1a760581c8f/kube-rbac-proxy/0.log" Nov 24 09:57:28 crc kubenswrapper[4563]: I1124 09:57:28.894686 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-g64g6_00f5e4f8-193c-48df-b29f-8f359f263a5a/operator/0.log" Nov 24 
09:57:28 crc kubenswrapper[4563]: I1124 09:57:28.944256 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-799cb6ffd6-wck8j_31e8d237-829e-47b0-8a2c-8e316a37dc78/kube-rbac-proxy/0.log" Nov 24 09:57:29 crc kubenswrapper[4563]: I1124 09:57:29.037560 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6cb9dc54f8-m7w2q_cdec1b8b-630a-452a-b4d9-3cd42ef204c7/manager/0.log" Nov 24 09:57:29 crc kubenswrapper[4563]: I1124 09:57:29.065123 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-799cb6ffd6-wck8j_31e8d237-829e-47b0-8a2c-8e316a37dc78/manager/0.log" Nov 24 09:57:29 crc kubenswrapper[4563]: I1124 09:57:29.270276 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7798859c74-z5b6f_9d2aa6d4-db94-44dd-99f1-6e95e8de9a5f/kube-rbac-proxy/0.log" Nov 24 09:57:29 crc kubenswrapper[4563]: I1124 09:57:29.321890 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7798859c74-z5b6f_9d2aa6d4-db94-44dd-99f1-6e95e8de9a5f/manager/0.log" Nov 24 09:57:29 crc kubenswrapper[4563]: I1124 09:57:29.394738 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-8464cf66df-chpfj_238f517b-0e10-411c-8b3c-c6bdbe261159/kube-rbac-proxy/0.log" Nov 24 09:57:29 crc kubenswrapper[4563]: I1124 09:57:29.415015 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-8464cf66df-chpfj_238f517b-0e10-411c-8b3c-c6bdbe261159/manager/0.log" Nov 24 09:57:29 crc kubenswrapper[4563]: I1124 09:57:29.502070 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7cd4fb6f79-qhzw4_d5e12170-5cc0-4f8f-89d7-c64f38f2226e/kube-rbac-proxy/0.log" Nov 24 09:57:29 crc kubenswrapper[4563]: I1124 09:57:29.544358 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7cd4fb6f79-qhzw4_d5e12170-5cc0-4f8f-89d7-c64f38f2226e/manager/0.log" Nov 24 09:57:38 crc kubenswrapper[4563]: I1124 09:57:38.055081 4563 scope.go:117] "RemoveContainer" containerID="b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108" Nov 24 09:57:38 crc kubenswrapper[4563]: E1124 09:57:38.055624 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:57:41 crc kubenswrapper[4563]: I1124 09:57:41.029226 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-z4lb4_378c7d30-dd7c-4aa5-83cf-7caca587f283/control-plane-machine-set-operator/0.log" Nov 24 09:57:41 crc kubenswrapper[4563]: I1124 09:57:41.145874 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bxdlb_48157749-8872-4c5b-b119-efe27cfd887e/kube-rbac-proxy/0.log" Nov 24 09:57:41 crc kubenswrapper[4563]: I1124 09:57:41.185936 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bxdlb_48157749-8872-4c5b-b119-efe27cfd887e/machine-api-operator/0.log" Nov 24 09:57:49 crc kubenswrapper[4563]: I1124 09:57:49.380176 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-bpkp8_271399d1-6304-4dcd-a3df-6c543849329e/cert-manager-controller/0.log" Nov 24 09:57:49 crc kubenswrapper[4563]: I1124 09:57:49.519479 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-mfw5p_dbb0b52d-058f-46a3-8342-811bd3f5b495/cert-manager-cainjector/0.log" Nov 24 09:57:49 crc kubenswrapper[4563]: I1124 09:57:49.546129 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-p6b8w_a32e3f9f-14d2-44fb-ba5a-9ede6e568643/cert-manager-webhook/0.log" Nov 24 09:57:50 crc kubenswrapper[4563]: I1124 09:57:50.055161 4563 scope.go:117] "RemoveContainer" containerID="b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108" Nov 24 09:57:50 crc kubenswrapper[4563]: E1124 09:57:50.055382 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:57:57 crc kubenswrapper[4563]: I1124 09:57:57.473887 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5874bd7bc5-tqw7b_89298ce0-9e0a-4351-96a9-4b69233c7ba0/nmstate-console-plugin/0.log" Nov 24 09:57:57 crc kubenswrapper[4563]: I1124 09:57:57.600183 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-jbdfx_71fe428e-199c-422c-8911-79d2a7d27ab1/nmstate-handler/0.log" Nov 24 09:57:57 crc kubenswrapper[4563]: I1124 09:57:57.633382 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-xf2lb_d0a1ac8a-df66-4ac5-9aed-a2001c905f21/kube-rbac-proxy/0.log" Nov 24 09:57:57 crc kubenswrapper[4563]: I1124 09:57:57.653130 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-xf2lb_d0a1ac8a-df66-4ac5-9aed-a2001c905f21/nmstate-metrics/0.log" Nov 24 09:57:57 crc kubenswrapper[4563]: I1124 09:57:57.741235 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-557fdffb88-ppw2s_e4241626-5fdc-4620-9ffd-6bdc19046a33/nmstate-operator/0.log" Nov 24 09:57:57 crc kubenswrapper[4563]: I1124 09:57:57.797234 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6b89b748d8-qz8gz_564d8757-3a04-48f3-b3a2-109930f83a10/nmstate-webhook/0.log" Nov 24 09:58:04 crc kubenswrapper[4563]: I1124 09:58:04.055438 4563 scope.go:117] "RemoveContainer" containerID="b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108" Nov 24 09:58:04 crc kubenswrapper[4563]: E1124 09:58:04.056060 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:58:07 crc kubenswrapper[4563]: I1124 09:58:07.300946 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-7rql8_e3ae4470-f488-4bc7-b9e0-a37903b5400a/kube-rbac-proxy/0.log" Nov 24 09:58:07 crc kubenswrapper[4563]: I1124 09:58:07.453279 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-7rql8_e3ae4470-f488-4bc7-b9e0-a37903b5400a/controller/0.log" Nov 24 09:58:07 crc 
kubenswrapper[4563]: I1124 09:58:07.499106 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/cp-frr-files/0.log" Nov 24 09:58:07 crc kubenswrapper[4563]: I1124 09:58:07.639243 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/cp-frr-files/0.log" Nov 24 09:58:07 crc kubenswrapper[4563]: I1124 09:58:07.640185 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/cp-metrics/0.log" Nov 24 09:58:07 crc kubenswrapper[4563]: I1124 09:58:07.647472 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/cp-reloader/0.log" Nov 24 09:58:07 crc kubenswrapper[4563]: I1124 09:58:07.647589 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/cp-reloader/0.log" Nov 24 09:58:07 crc kubenswrapper[4563]: I1124 09:58:07.789238 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/cp-frr-files/0.log" Nov 24 09:58:07 crc kubenswrapper[4563]: I1124 09:58:07.793926 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/cp-reloader/0.log" Nov 24 09:58:07 crc kubenswrapper[4563]: I1124 09:58:07.799620 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/cp-metrics/0.log" Nov 24 09:58:07 crc kubenswrapper[4563]: I1124 09:58:07.826872 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/cp-metrics/0.log" Nov 24 09:58:07 crc kubenswrapper[4563]: I1124 09:58:07.942473 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/cp-frr-files/0.log" Nov 24 09:58:07 crc kubenswrapper[4563]: I1124 09:58:07.947721 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/cp-reloader/0.log" Nov 24 09:58:07 crc kubenswrapper[4563]: I1124 09:58:07.970518 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/controller/0.log" Nov 24 09:58:07 crc kubenswrapper[4563]: I1124 09:58:07.972743 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/cp-metrics/0.log" Nov 24 09:58:08 crc kubenswrapper[4563]: I1124 09:58:08.087526 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/frr-metrics/0.log" Nov 24 09:58:08 crc kubenswrapper[4563]: I1124 09:58:08.129970 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/kube-rbac-proxy/0.log" Nov 24 09:58:08 crc kubenswrapper[4563]: I1124 09:58:08.147768 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/kube-rbac-proxy-frr/0.log" Nov 24 09:58:08 crc kubenswrapper[4563]: I1124 09:58:08.265627 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/reloader/0.log" Nov 24 09:58:08 crc kubenswrapper[4563]: I1124 09:58:08.345725 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-6998585d5-fchts_22afba2f-88ba-4b65-8f98-a024f676b896/frr-k8s-webhook-server/0.log" Nov 24 09:58:08 crc kubenswrapper[4563]: I1124 09:58:08.476838 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7cd8c86c8-w62w9_5784d9be-5a59-4204-829a-dc637bfb7d90/manager/0.log" Nov 24 09:58:08 crc kubenswrapper[4563]: I1124 09:58:08.647554 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-559746f898-fwz9n_29db6437-f6b7-4f7f-a855-33b7316b09f8/webhook-server/0.log" Nov 24 09:58:08 crc kubenswrapper[4563]: I1124 09:58:08.733822 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tddpp_3b0c98e8-df1b-485b-972d-2e2ff8103006/kube-rbac-proxy/0.log" Nov 24 09:58:09 crc kubenswrapper[4563]: I1124 09:58:09.229284 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tddpp_3b0c98e8-df1b-485b-972d-2e2ff8103006/speaker/0.log" Nov 24 09:58:09 crc kubenswrapper[4563]: I1124 09:58:09.264851 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qwb5s_54e0a364-1f3d-493b-8c11-2d59672a99e1/frr/0.log" Nov 24 09:58:17 crc kubenswrapper[4563]: I1124 09:58:17.055861 4563 scope.go:117] "RemoveContainer" containerID="b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108" Nov 24 09:58:17 crc kubenswrapper[4563]: E1124 09:58:17.056745 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:58:17 crc kubenswrapper[4563]: I1124 09:58:17.901379 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc_e25e7e42-b065-4947-a1d0-3d641b371d06/util/0.log" Nov 24 09:58:18 crc kubenswrapper[4563]: I1124 
09:58:18.020488 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc_e25e7e42-b065-4947-a1d0-3d641b371d06/util/0.log" Nov 24 09:58:18 crc kubenswrapper[4563]: I1124 09:58:18.137044 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc_e25e7e42-b065-4947-a1d0-3d641b371d06/pull/0.log" Nov 24 09:58:18 crc kubenswrapper[4563]: I1124 09:58:18.180227 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc_e25e7e42-b065-4947-a1d0-3d641b371d06/pull/0.log" Nov 24 09:58:18 crc kubenswrapper[4563]: I1124 09:58:18.324726 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc_e25e7e42-b065-4947-a1d0-3d641b371d06/util/0.log" Nov 24 09:58:18 crc kubenswrapper[4563]: I1124 09:58:18.341540 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc_e25e7e42-b065-4947-a1d0-3d641b371d06/pull/0.log" Nov 24 09:58:18 crc kubenswrapper[4563]: I1124 09:58:18.365336 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772emn9hc_e25e7e42-b065-4947-a1d0-3d641b371d06/extract/0.log" Nov 24 09:58:18 crc kubenswrapper[4563]: I1124 09:58:18.462651 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sf9ml_83e0ce45-d845-49ab-b393-af085b920737/extract-utilities/0.log" Nov 24 09:58:18 crc kubenswrapper[4563]: I1124 09:58:18.573797 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sf9ml_83e0ce45-d845-49ab-b393-af085b920737/extract-content/0.log" 
Nov 24 09:58:18 crc kubenswrapper[4563]: I1124 09:58:18.577908 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sf9ml_83e0ce45-d845-49ab-b393-af085b920737/extract-content/0.log" Nov 24 09:58:18 crc kubenswrapper[4563]: I1124 09:58:18.596415 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sf9ml_83e0ce45-d845-49ab-b393-af085b920737/extract-utilities/0.log" Nov 24 09:58:18 crc kubenswrapper[4563]: I1124 09:58:18.732013 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sf9ml_83e0ce45-d845-49ab-b393-af085b920737/extract-content/0.log" Nov 24 09:58:18 crc kubenswrapper[4563]: I1124 09:58:18.740301 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sf9ml_83e0ce45-d845-49ab-b393-af085b920737/extract-utilities/0.log" Nov 24 09:58:18 crc kubenswrapper[4563]: I1124 09:58:18.977363 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6cgfm_48f738ab-b1e0-4e09-baf3-3ed15d54151c/extract-utilities/0.log" Nov 24 09:58:19 crc kubenswrapper[4563]: I1124 09:58:19.082438 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6cgfm_48f738ab-b1e0-4e09-baf3-3ed15d54151c/extract-content/0.log" Nov 24 09:58:19 crc kubenswrapper[4563]: I1124 09:58:19.130285 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6cgfm_48f738ab-b1e0-4e09-baf3-3ed15d54151c/extract-utilities/0.log" Nov 24 09:58:19 crc kubenswrapper[4563]: I1124 09:58:19.172879 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sf9ml_83e0ce45-d845-49ab-b393-af085b920737/registry-server/0.log" Nov 24 09:58:19 crc kubenswrapper[4563]: I1124 09:58:19.197500 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-6cgfm_48f738ab-b1e0-4e09-baf3-3ed15d54151c/extract-content/0.log" Nov 24 09:58:19 crc kubenswrapper[4563]: I1124 09:58:19.304021 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6cgfm_48f738ab-b1e0-4e09-baf3-3ed15d54151c/extract-utilities/0.log" Nov 24 09:58:19 crc kubenswrapper[4563]: I1124 09:58:19.335588 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6cgfm_48f738ab-b1e0-4e09-baf3-3ed15d54151c/extract-content/0.log" Nov 24 09:58:19 crc kubenswrapper[4563]: I1124 09:58:19.511510 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc_2e79caaa-09f9-4720-b80d-300d880d7e26/util/0.log" Nov 24 09:58:19 crc kubenswrapper[4563]: I1124 09:58:19.625281 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc_2e79caaa-09f9-4720-b80d-300d880d7e26/util/0.log" Nov 24 09:58:19 crc kubenswrapper[4563]: I1124 09:58:19.685568 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc_2e79caaa-09f9-4720-b80d-300d880d7e26/pull/0.log" Nov 24 09:58:19 crc kubenswrapper[4563]: I1124 09:58:19.719380 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc_2e79caaa-09f9-4720-b80d-300d880d7e26/pull/0.log" Nov 24 09:58:19 crc kubenswrapper[4563]: I1124 09:58:19.758383 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6cgfm_48f738ab-b1e0-4e09-baf3-3ed15d54151c/registry-server/0.log" Nov 24 09:58:19 crc kubenswrapper[4563]: I1124 09:58:19.868582 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc_2e79caaa-09f9-4720-b80d-300d880d7e26/util/0.log" Nov 24 09:58:19 crc kubenswrapper[4563]: I1124 09:58:19.888396 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc_2e79caaa-09f9-4720-b80d-300d880d7e26/pull/0.log" Nov 24 09:58:19 crc kubenswrapper[4563]: I1124 09:58:19.895180 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6r8khc_2e79caaa-09f9-4720-b80d-300d880d7e26/extract/0.log" Nov 24 09:58:20 crc kubenswrapper[4563]: I1124 09:58:20.019583 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-5pxln_157ed1a3-ea31-4a6b-8e91-2852d4c50600/marketplace-operator/0.log" Nov 24 09:58:20 crc kubenswrapper[4563]: I1124 09:58:20.043234 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nsq9v_cb0770b7-f971-4e96-ab32-1f41b4cd9885/extract-utilities/0.log" Nov 24 09:58:20 crc kubenswrapper[4563]: I1124 09:58:20.163191 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nsq9v_cb0770b7-f971-4e96-ab32-1f41b4cd9885/extract-content/0.log" Nov 24 09:58:20 crc kubenswrapper[4563]: I1124 09:58:20.170241 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nsq9v_cb0770b7-f971-4e96-ab32-1f41b4cd9885/extract-utilities/0.log" Nov 24 09:58:20 crc kubenswrapper[4563]: I1124 09:58:20.222497 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nsq9v_cb0770b7-f971-4e96-ab32-1f41b4cd9885/extract-content/0.log" Nov 24 09:58:20 crc kubenswrapper[4563]: I1124 09:58:20.319161 4563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-nsq9v_cb0770b7-f971-4e96-ab32-1f41b4cd9885/extract-content/0.log" Nov 24 09:58:20 crc kubenswrapper[4563]: I1124 09:58:20.321533 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nsq9v_cb0770b7-f971-4e96-ab32-1f41b4cd9885/extract-utilities/0.log" Nov 24 09:58:20 crc kubenswrapper[4563]: I1124 09:58:20.454990 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nsq9v_cb0770b7-f971-4e96-ab32-1f41b4cd9885/registry-server/0.log" Nov 24 09:58:20 crc kubenswrapper[4563]: I1124 09:58:20.479366 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r9lpl_e26141fd-4cfa-4726-ba65-1f3bb830411b/extract-utilities/0.log" Nov 24 09:58:20 crc kubenswrapper[4563]: I1124 09:58:20.659551 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r9lpl_e26141fd-4cfa-4726-ba65-1f3bb830411b/extract-utilities/0.log" Nov 24 09:58:20 crc kubenswrapper[4563]: I1124 09:58:20.673672 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r9lpl_e26141fd-4cfa-4726-ba65-1f3bb830411b/extract-content/0.log" Nov 24 09:58:20 crc kubenswrapper[4563]: I1124 09:58:20.684533 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r9lpl_e26141fd-4cfa-4726-ba65-1f3bb830411b/extract-content/0.log" Nov 24 09:58:20 crc kubenswrapper[4563]: I1124 09:58:20.805181 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r9lpl_e26141fd-4cfa-4726-ba65-1f3bb830411b/extract-content/0.log" Nov 24 09:58:20 crc kubenswrapper[4563]: I1124 09:58:20.817297 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r9lpl_e26141fd-4cfa-4726-ba65-1f3bb830411b/extract-utilities/0.log" Nov 
24 09:58:21 crc kubenswrapper[4563]: I1124 09:58:21.079630 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r9lpl_e26141fd-4cfa-4726-ba65-1f3bb830411b/registry-server/0.log" Nov 24 09:58:28 crc kubenswrapper[4563]: I1124 09:58:28.055039 4563 scope.go:117] "RemoveContainer" containerID="b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108" Nov 24 09:58:28 crc kubenswrapper[4563]: E1124 09:58:28.055678 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:58:39 crc kubenswrapper[4563]: I1124 09:58:39.054628 4563 scope.go:117] "RemoveContainer" containerID="b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108" Nov 24 09:58:39 crc kubenswrapper[4563]: E1124 09:58:39.055444 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:58:52 crc kubenswrapper[4563]: I1124 09:58:52.055730 4563 scope.go:117] "RemoveContainer" containerID="b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108" Nov 24 09:58:52 crc kubenswrapper[4563]: E1124 09:58:52.057249 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:59:03 crc kubenswrapper[4563]: I1124 09:59:03.061908 4563 scope.go:117] "RemoveContainer" containerID="b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108" Nov 24 09:59:03 crc kubenswrapper[4563]: E1124 09:59:03.064077 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:59:16 crc kubenswrapper[4563]: I1124 09:59:16.055127 4563 scope.go:117] "RemoveContainer" containerID="b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108" Nov 24 09:59:16 crc kubenswrapper[4563]: E1124 09:59:16.056874 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:59:28 crc kubenswrapper[4563]: I1124 09:59:28.054130 4563 scope.go:117] "RemoveContainer" containerID="b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108" Nov 24 09:59:28 crc kubenswrapper[4563]: E1124 09:59:28.055022 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:59:34 crc kubenswrapper[4563]: I1124 09:59:34.389260 4563 generic.go:334] "Generic (PLEG): container finished" podID="73563124-8005-419b-ae79-70a52a25a823" containerID="7c4f9ff8c22199340577363c0c05b7c2015f0a43cf29aec58e24be79a519a2f0" exitCode=0 Nov 24 09:59:34 crc kubenswrapper[4563]: I1124 09:59:34.389312 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h7gkj/must-gather-v87wf" event={"ID":"73563124-8005-419b-ae79-70a52a25a823","Type":"ContainerDied","Data":"7c4f9ff8c22199340577363c0c05b7c2015f0a43cf29aec58e24be79a519a2f0"} Nov 24 09:59:34 crc kubenswrapper[4563]: I1124 09:59:34.390047 4563 scope.go:117] "RemoveContainer" containerID="7c4f9ff8c22199340577363c0c05b7c2015f0a43cf29aec58e24be79a519a2f0" Nov 24 09:59:34 crc kubenswrapper[4563]: I1124 09:59:34.886530 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-h7gkj_must-gather-v87wf_73563124-8005-419b-ae79-70a52a25a823/gather/0.log" Nov 24 09:59:42 crc kubenswrapper[4563]: I1124 09:59:42.054557 4563 scope.go:117] "RemoveContainer" containerID="b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108" Nov 24 09:59:42 crc kubenswrapper[4563]: E1124 09:59:42.055508 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 09:59:44 crc kubenswrapper[4563]: I1124 09:59:44.340780 4563 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-h7gkj/must-gather-v87wf"] Nov 24 09:59:44 crc kubenswrapper[4563]: I1124 09:59:44.341802 4563 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-h7gkj/must-gather-v87wf" podUID="73563124-8005-419b-ae79-70a52a25a823" containerName="copy" containerID="cri-o://a4fb22a31c4810ea4f9a967db90b4ba0697fa6e5cb579ad943ecfc162521b9c1" gracePeriod=2 Nov 24 09:59:44 crc kubenswrapper[4563]: I1124 09:59:44.353197 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-h7gkj/must-gather-v87wf"] Nov 24 09:59:44 crc kubenswrapper[4563]: I1124 09:59:44.721159 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-h7gkj_must-gather-v87wf_73563124-8005-419b-ae79-70a52a25a823/copy/0.log" Nov 24 09:59:44 crc kubenswrapper[4563]: I1124 09:59:44.721706 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h7gkj/must-gather-v87wf" Nov 24 09:59:44 crc kubenswrapper[4563]: I1124 09:59:44.758260 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/73563124-8005-419b-ae79-70a52a25a823-must-gather-output\") pod \"73563124-8005-419b-ae79-70a52a25a823\" (UID: \"73563124-8005-419b-ae79-70a52a25a823\") " Nov 24 09:59:44 crc kubenswrapper[4563]: I1124 09:59:44.758529 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krlx9\" (UniqueName: \"kubernetes.io/projected/73563124-8005-419b-ae79-70a52a25a823-kube-api-access-krlx9\") pod \"73563124-8005-419b-ae79-70a52a25a823\" (UID: \"73563124-8005-419b-ae79-70a52a25a823\") " Nov 24 09:59:44 crc kubenswrapper[4563]: I1124 09:59:44.766775 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/73563124-8005-419b-ae79-70a52a25a823-kube-api-access-krlx9" (OuterVolumeSpecName: "kube-api-access-krlx9") pod "73563124-8005-419b-ae79-70a52a25a823" (UID: "73563124-8005-419b-ae79-70a52a25a823"). InnerVolumeSpecName "kube-api-access-krlx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 09:59:44 crc kubenswrapper[4563]: I1124 09:59:44.861078 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krlx9\" (UniqueName: \"kubernetes.io/projected/73563124-8005-419b-ae79-70a52a25a823-kube-api-access-krlx9\") on node \"crc\" DevicePath \"\"" Nov 24 09:59:44 crc kubenswrapper[4563]: I1124 09:59:44.896214 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73563124-8005-419b-ae79-70a52a25a823-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "73563124-8005-419b-ae79-70a52a25a823" (UID: "73563124-8005-419b-ae79-70a52a25a823"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 24 09:59:44 crc kubenswrapper[4563]: I1124 09:59:44.963664 4563 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/73563124-8005-419b-ae79-70a52a25a823-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 24 09:59:45 crc kubenswrapper[4563]: I1124 09:59:45.062382 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73563124-8005-419b-ae79-70a52a25a823" path="/var/lib/kubelet/pods/73563124-8005-419b-ae79-70a52a25a823/volumes" Nov 24 09:59:45 crc kubenswrapper[4563]: I1124 09:59:45.464239 4563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-h7gkj_must-gather-v87wf_73563124-8005-419b-ae79-70a52a25a823/copy/0.log" Nov 24 09:59:45 crc kubenswrapper[4563]: I1124 09:59:45.464651 4563 generic.go:334] "Generic (PLEG): container finished" podID="73563124-8005-419b-ae79-70a52a25a823" containerID="a4fb22a31c4810ea4f9a967db90b4ba0697fa6e5cb579ad943ecfc162521b9c1" exitCode=143 Nov 24 09:59:45 crc kubenswrapper[4563]: I1124 09:59:45.464708 4563 scope.go:117] "RemoveContainer" containerID="a4fb22a31c4810ea4f9a967db90b4ba0697fa6e5cb579ad943ecfc162521b9c1" Nov 24 09:59:45 crc kubenswrapper[4563]: I1124 09:59:45.464833 4563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h7gkj/must-gather-v87wf" Nov 24 09:59:45 crc kubenswrapper[4563]: I1124 09:59:45.479613 4563 scope.go:117] "RemoveContainer" containerID="7c4f9ff8c22199340577363c0c05b7c2015f0a43cf29aec58e24be79a519a2f0" Nov 24 09:59:45 crc kubenswrapper[4563]: I1124 09:59:45.539378 4563 scope.go:117] "RemoveContainer" containerID="a4fb22a31c4810ea4f9a967db90b4ba0697fa6e5cb579ad943ecfc162521b9c1" Nov 24 09:59:45 crc kubenswrapper[4563]: E1124 09:59:45.539833 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4fb22a31c4810ea4f9a967db90b4ba0697fa6e5cb579ad943ecfc162521b9c1\": container with ID starting with a4fb22a31c4810ea4f9a967db90b4ba0697fa6e5cb579ad943ecfc162521b9c1 not found: ID does not exist" containerID="a4fb22a31c4810ea4f9a967db90b4ba0697fa6e5cb579ad943ecfc162521b9c1" Nov 24 09:59:45 crc kubenswrapper[4563]: I1124 09:59:45.539875 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4fb22a31c4810ea4f9a967db90b4ba0697fa6e5cb579ad943ecfc162521b9c1"} err="failed to get container status \"a4fb22a31c4810ea4f9a967db90b4ba0697fa6e5cb579ad943ecfc162521b9c1\": rpc error: code = NotFound desc = could not find container \"a4fb22a31c4810ea4f9a967db90b4ba0697fa6e5cb579ad943ecfc162521b9c1\": container with ID starting with a4fb22a31c4810ea4f9a967db90b4ba0697fa6e5cb579ad943ecfc162521b9c1 not found: ID does not exist" Nov 24 09:59:45 crc kubenswrapper[4563]: I1124 09:59:45.539902 4563 scope.go:117] "RemoveContainer" containerID="7c4f9ff8c22199340577363c0c05b7c2015f0a43cf29aec58e24be79a519a2f0" Nov 24 09:59:45 crc kubenswrapper[4563]: E1124 09:59:45.540298 4563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c4f9ff8c22199340577363c0c05b7c2015f0a43cf29aec58e24be79a519a2f0\": container with ID starting with 
7c4f9ff8c22199340577363c0c05b7c2015f0a43cf29aec58e24be79a519a2f0 not found: ID does not exist" containerID="7c4f9ff8c22199340577363c0c05b7c2015f0a43cf29aec58e24be79a519a2f0" Nov 24 09:59:45 crc kubenswrapper[4563]: I1124 09:59:45.540386 4563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c4f9ff8c22199340577363c0c05b7c2015f0a43cf29aec58e24be79a519a2f0"} err="failed to get container status \"7c4f9ff8c22199340577363c0c05b7c2015f0a43cf29aec58e24be79a519a2f0\": rpc error: code = NotFound desc = could not find container \"7c4f9ff8c22199340577363c0c05b7c2015f0a43cf29aec58e24be79a519a2f0\": container with ID starting with 7c4f9ff8c22199340577363c0c05b7c2015f0a43cf29aec58e24be79a519a2f0 not found: ID does not exist" Nov 24 09:59:53 crc kubenswrapper[4563]: I1124 09:59:53.059090 4563 scope.go:117] "RemoveContainer" containerID="b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108" Nov 24 09:59:53 crc kubenswrapper[4563]: E1124 09:59:53.060069 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 10:00:00 crc kubenswrapper[4563]: I1124 10:00:00.157566 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399640-8c2nv"] Nov 24 10:00:00 crc kubenswrapper[4563]: E1124 10:00:00.158453 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a7e2be0-612f-4055-a30d-843de4b14442" containerName="container-00" Nov 24 10:00:00 crc kubenswrapper[4563]: I1124 10:00:00.158467 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a7e2be0-612f-4055-a30d-843de4b14442" 
containerName="container-00" Nov 24 10:00:00 crc kubenswrapper[4563]: E1124 10:00:00.158482 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73563124-8005-419b-ae79-70a52a25a823" containerName="copy" Nov 24 10:00:00 crc kubenswrapper[4563]: I1124 10:00:00.158487 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="73563124-8005-419b-ae79-70a52a25a823" containerName="copy" Nov 24 10:00:00 crc kubenswrapper[4563]: E1124 10:00:00.158498 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73563124-8005-419b-ae79-70a52a25a823" containerName="gather" Nov 24 10:00:00 crc kubenswrapper[4563]: I1124 10:00:00.158504 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="73563124-8005-419b-ae79-70a52a25a823" containerName="gather" Nov 24 10:00:00 crc kubenswrapper[4563]: I1124 10:00:00.158710 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a7e2be0-612f-4055-a30d-843de4b14442" containerName="container-00" Nov 24 10:00:00 crc kubenswrapper[4563]: I1124 10:00:00.158722 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="73563124-8005-419b-ae79-70a52a25a823" containerName="gather" Nov 24 10:00:00 crc kubenswrapper[4563]: I1124 10:00:00.158739 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="73563124-8005-419b-ae79-70a52a25a823" containerName="copy" Nov 24 10:00:00 crc kubenswrapper[4563]: I1124 10:00:00.159297 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399640-8c2nv" Nov 24 10:00:00 crc kubenswrapper[4563]: I1124 10:00:00.160864 4563 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 24 10:00:00 crc kubenswrapper[4563]: I1124 10:00:00.160899 4563 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 24 10:00:00 crc kubenswrapper[4563]: I1124 10:00:00.166266 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399640-8c2nv"] Nov 24 10:00:00 crc kubenswrapper[4563]: I1124 10:00:00.232936 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kphbk\" (UniqueName: \"kubernetes.io/projected/1edb0f25-6631-4aa5-826a-fc673ce26c7d-kube-api-access-kphbk\") pod \"collect-profiles-29399640-8c2nv\" (UID: \"1edb0f25-6631-4aa5-826a-fc673ce26c7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399640-8c2nv" Nov 24 10:00:00 crc kubenswrapper[4563]: I1124 10:00:00.233037 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1edb0f25-6631-4aa5-826a-fc673ce26c7d-secret-volume\") pod \"collect-profiles-29399640-8c2nv\" (UID: \"1edb0f25-6631-4aa5-826a-fc673ce26c7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399640-8c2nv" Nov 24 10:00:00 crc kubenswrapper[4563]: I1124 10:00:00.233080 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1edb0f25-6631-4aa5-826a-fc673ce26c7d-config-volume\") pod \"collect-profiles-29399640-8c2nv\" (UID: \"1edb0f25-6631-4aa5-826a-fc673ce26c7d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29399640-8c2nv" Nov 24 10:00:00 crc kubenswrapper[4563]: I1124 10:00:00.333877 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1edb0f25-6631-4aa5-826a-fc673ce26c7d-secret-volume\") pod \"collect-profiles-29399640-8c2nv\" (UID: \"1edb0f25-6631-4aa5-826a-fc673ce26c7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399640-8c2nv" Nov 24 10:00:00 crc kubenswrapper[4563]: I1124 10:00:00.333930 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1edb0f25-6631-4aa5-826a-fc673ce26c7d-config-volume\") pod \"collect-profiles-29399640-8c2nv\" (UID: \"1edb0f25-6631-4aa5-826a-fc673ce26c7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399640-8c2nv" Nov 24 10:00:00 crc kubenswrapper[4563]: I1124 10:00:00.334015 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kphbk\" (UniqueName: \"kubernetes.io/projected/1edb0f25-6631-4aa5-826a-fc673ce26c7d-kube-api-access-kphbk\") pod \"collect-profiles-29399640-8c2nv\" (UID: \"1edb0f25-6631-4aa5-826a-fc673ce26c7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399640-8c2nv" Nov 24 10:00:00 crc kubenswrapper[4563]: I1124 10:00:00.334877 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1edb0f25-6631-4aa5-826a-fc673ce26c7d-config-volume\") pod \"collect-profiles-29399640-8c2nv\" (UID: \"1edb0f25-6631-4aa5-826a-fc673ce26c7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399640-8c2nv" Nov 24 10:00:00 crc kubenswrapper[4563]: I1124 10:00:00.339615 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/1edb0f25-6631-4aa5-826a-fc673ce26c7d-secret-volume\") pod \"collect-profiles-29399640-8c2nv\" (UID: \"1edb0f25-6631-4aa5-826a-fc673ce26c7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399640-8c2nv" Nov 24 10:00:00 crc kubenswrapper[4563]: I1124 10:00:00.348052 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kphbk\" (UniqueName: \"kubernetes.io/projected/1edb0f25-6631-4aa5-826a-fc673ce26c7d-kube-api-access-kphbk\") pod \"collect-profiles-29399640-8c2nv\" (UID: \"1edb0f25-6631-4aa5-826a-fc673ce26c7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29399640-8c2nv" Nov 24 10:00:00 crc kubenswrapper[4563]: I1124 10:00:00.477485 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399640-8c2nv" Nov 24 10:00:00 crc kubenswrapper[4563]: I1124 10:00:00.852496 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399640-8c2nv"] Nov 24 10:00:01 crc kubenswrapper[4563]: I1124 10:00:01.583970 4563 generic.go:334] "Generic (PLEG): container finished" podID="1edb0f25-6631-4aa5-826a-fc673ce26c7d" containerID="55dd0dcea11950384b5fc48a9e9fa4b46b0f7a90ab1cf96d5845febef1359aa0" exitCode=0 Nov 24 10:00:01 crc kubenswrapper[4563]: I1124 10:00:01.584059 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399640-8c2nv" event={"ID":"1edb0f25-6631-4aa5-826a-fc673ce26c7d","Type":"ContainerDied","Data":"55dd0dcea11950384b5fc48a9e9fa4b46b0f7a90ab1cf96d5845febef1359aa0"} Nov 24 10:00:01 crc kubenswrapper[4563]: I1124 10:00:01.584294 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399640-8c2nv" 
event={"ID":"1edb0f25-6631-4aa5-826a-fc673ce26c7d","Type":"ContainerStarted","Data":"7c57e7af339a09424030ae7dc6c6e8f04b461199327a7568bee0e563e44c303f"} Nov 24 10:00:02 crc kubenswrapper[4563]: I1124 10:00:02.829397 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399640-8c2nv" Nov 24 10:00:02 crc kubenswrapper[4563]: I1124 10:00:02.983254 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1edb0f25-6631-4aa5-826a-fc673ce26c7d-config-volume\") pod \"1edb0f25-6631-4aa5-826a-fc673ce26c7d\" (UID: \"1edb0f25-6631-4aa5-826a-fc673ce26c7d\") " Nov 24 10:00:02 crc kubenswrapper[4563]: I1124 10:00:02.983329 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kphbk\" (UniqueName: \"kubernetes.io/projected/1edb0f25-6631-4aa5-826a-fc673ce26c7d-kube-api-access-kphbk\") pod \"1edb0f25-6631-4aa5-826a-fc673ce26c7d\" (UID: \"1edb0f25-6631-4aa5-826a-fc673ce26c7d\") " Nov 24 10:00:02 crc kubenswrapper[4563]: I1124 10:00:02.983420 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1edb0f25-6631-4aa5-826a-fc673ce26c7d-secret-volume\") pod \"1edb0f25-6631-4aa5-826a-fc673ce26c7d\" (UID: \"1edb0f25-6631-4aa5-826a-fc673ce26c7d\") " Nov 24 10:00:02 crc kubenswrapper[4563]: I1124 10:00:02.984765 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1edb0f25-6631-4aa5-826a-fc673ce26c7d-config-volume" (OuterVolumeSpecName: "config-volume") pod "1edb0f25-6631-4aa5-826a-fc673ce26c7d" (UID: "1edb0f25-6631-4aa5-826a-fc673ce26c7d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 24 10:00:02 crc kubenswrapper[4563]: I1124 10:00:02.991321 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1edb0f25-6631-4aa5-826a-fc673ce26c7d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1edb0f25-6631-4aa5-826a-fc673ce26c7d" (UID: "1edb0f25-6631-4aa5-826a-fc673ce26c7d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 10:00:02 crc kubenswrapper[4563]: I1124 10:00:02.991358 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1edb0f25-6631-4aa5-826a-fc673ce26c7d-kube-api-access-kphbk" (OuterVolumeSpecName: "kube-api-access-kphbk") pod "1edb0f25-6631-4aa5-826a-fc673ce26c7d" (UID: "1edb0f25-6631-4aa5-826a-fc673ce26c7d"). InnerVolumeSpecName "kube-api-access-kphbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 10:00:03 crc kubenswrapper[4563]: I1124 10:00:03.085672 4563 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1edb0f25-6631-4aa5-826a-fc673ce26c7d-config-volume\") on node \"crc\" DevicePath \"\"" Nov 24 10:00:03 crc kubenswrapper[4563]: I1124 10:00:03.085702 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kphbk\" (UniqueName: \"kubernetes.io/projected/1edb0f25-6631-4aa5-826a-fc673ce26c7d-kube-api-access-kphbk\") on node \"crc\" DevicePath \"\"" Nov 24 10:00:03 crc kubenswrapper[4563]: I1124 10:00:03.085713 4563 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1edb0f25-6631-4aa5-826a-fc673ce26c7d-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 24 10:00:03 crc kubenswrapper[4563]: I1124 10:00:03.599120 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29399640-8c2nv" 
event={"ID":"1edb0f25-6631-4aa5-826a-fc673ce26c7d","Type":"ContainerDied","Data":"7c57e7af339a09424030ae7dc6c6e8f04b461199327a7568bee0e563e44c303f"} Nov 24 10:00:03 crc kubenswrapper[4563]: I1124 10:00:03.599166 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c57e7af339a09424030ae7dc6c6e8f04b461199327a7568bee0e563e44c303f" Nov 24 10:00:03 crc kubenswrapper[4563]: I1124 10:00:03.599179 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29399640-8c2nv" Nov 24 10:00:03 crc kubenswrapper[4563]: I1124 10:00:03.870652 4563 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399595-p4k4t"] Nov 24 10:00:03 crc kubenswrapper[4563]: I1124 10:00:03.876175 4563 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29399595-p4k4t"] Nov 24 10:00:05 crc kubenswrapper[4563]: I1124 10:00:05.065729 4563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92446a78-c0f0-433a-a410-9414ceb0a78d" path="/var/lib/kubelet/pods/92446a78-c0f0-433a-a410-9414ceb0a78d/volumes" Nov 24 10:00:07 crc kubenswrapper[4563]: I1124 10:00:07.054302 4563 scope.go:117] "RemoveContainer" containerID="b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108" Nov 24 10:00:07 crc kubenswrapper[4563]: E1124 10:00:07.055041 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 10:00:19 crc kubenswrapper[4563]: I1124 10:00:19.054988 4563 scope.go:117] "RemoveContainer" 
containerID="b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108" Nov 24 10:00:19 crc kubenswrapper[4563]: E1124 10:00:19.056292 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 10:00:30 crc kubenswrapper[4563]: I1124 10:00:30.055368 4563 scope.go:117] "RemoveContainer" containerID="b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108" Nov 24 10:00:30 crc kubenswrapper[4563]: E1124 10:00:30.056074 4563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stlxr_openshift-machine-config-operator(3b2bfe55-8989-49b3-bb61-e28189447627)\"" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" podUID="3b2bfe55-8989-49b3-bb61-e28189447627" Nov 24 10:00:44 crc kubenswrapper[4563]: I1124 10:00:44.055335 4563 scope.go:117] "RemoveContainer" containerID="b48fdf0ba62b83c4f69e2a4bf9fd2a88b501712db1a82d9a8123c0b640f9f108" Nov 24 10:00:44 crc kubenswrapper[4563]: I1124 10:00:44.876201 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stlxr" event={"ID":"3b2bfe55-8989-49b3-bb61-e28189447627","Type":"ContainerStarted","Data":"915fcdee08266bdf094c92d73fa6e625d156b7c1586410ef5f188578afe10ea9"} Nov 24 10:00:53 crc kubenswrapper[4563]: I1124 10:00:53.135721 4563 scope.go:117] "RemoveContainer" containerID="56039f2995c9760adca7d3fd0030414b8abaccbe79f400d82a567adeff990325" Nov 24 10:01:00 crc kubenswrapper[4563]: I1124 
10:01:00.147556 4563 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29399641-t9kgj"] Nov 24 10:01:00 crc kubenswrapper[4563]: E1124 10:01:00.148551 4563 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1edb0f25-6631-4aa5-826a-fc673ce26c7d" containerName="collect-profiles" Nov 24 10:01:00 crc kubenswrapper[4563]: I1124 10:01:00.148567 4563 state_mem.go:107] "Deleted CPUSet assignment" podUID="1edb0f25-6631-4aa5-826a-fc673ce26c7d" containerName="collect-profiles" Nov 24 10:01:00 crc kubenswrapper[4563]: I1124 10:01:00.148789 4563 memory_manager.go:354] "RemoveStaleState removing state" podUID="1edb0f25-6631-4aa5-826a-fc673ce26c7d" containerName="collect-profiles" Nov 24 10:01:00 crc kubenswrapper[4563]: I1124 10:01:00.149412 4563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29399641-t9kgj" Nov 24 10:01:00 crc kubenswrapper[4563]: I1124 10:01:00.157824 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29399641-t9kgj"] Nov 24 10:01:00 crc kubenswrapper[4563]: I1124 10:01:00.225242 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3973d5aa-7f77-403a-bfeb-a8b5c331923d-fernet-keys\") pod \"keystone-cron-29399641-t9kgj\" (UID: \"3973d5aa-7f77-403a-bfeb-a8b5c331923d\") " pod="openstack/keystone-cron-29399641-t9kgj" Nov 24 10:01:00 crc kubenswrapper[4563]: I1124 10:01:00.225321 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnrx4\" (UniqueName: \"kubernetes.io/projected/3973d5aa-7f77-403a-bfeb-a8b5c331923d-kube-api-access-wnrx4\") pod \"keystone-cron-29399641-t9kgj\" (UID: \"3973d5aa-7f77-403a-bfeb-a8b5c331923d\") " pod="openstack/keystone-cron-29399641-t9kgj" Nov 24 10:01:00 crc kubenswrapper[4563]: I1124 10:01:00.225349 4563 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3973d5aa-7f77-403a-bfeb-a8b5c331923d-config-data\") pod \"keystone-cron-29399641-t9kgj\" (UID: \"3973d5aa-7f77-403a-bfeb-a8b5c331923d\") " pod="openstack/keystone-cron-29399641-t9kgj" Nov 24 10:01:00 crc kubenswrapper[4563]: I1124 10:01:00.225367 4563 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3973d5aa-7f77-403a-bfeb-a8b5c331923d-combined-ca-bundle\") pod \"keystone-cron-29399641-t9kgj\" (UID: \"3973d5aa-7f77-403a-bfeb-a8b5c331923d\") " pod="openstack/keystone-cron-29399641-t9kgj" Nov 24 10:01:00 crc kubenswrapper[4563]: I1124 10:01:00.327242 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3973d5aa-7f77-403a-bfeb-a8b5c331923d-fernet-keys\") pod \"keystone-cron-29399641-t9kgj\" (UID: \"3973d5aa-7f77-403a-bfeb-a8b5c331923d\") " pod="openstack/keystone-cron-29399641-t9kgj" Nov 24 10:01:00 crc kubenswrapper[4563]: I1124 10:01:00.327479 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnrx4\" (UniqueName: \"kubernetes.io/projected/3973d5aa-7f77-403a-bfeb-a8b5c331923d-kube-api-access-wnrx4\") pod \"keystone-cron-29399641-t9kgj\" (UID: \"3973d5aa-7f77-403a-bfeb-a8b5c331923d\") " pod="openstack/keystone-cron-29399641-t9kgj" Nov 24 10:01:00 crc kubenswrapper[4563]: I1124 10:01:00.327578 4563 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3973d5aa-7f77-403a-bfeb-a8b5c331923d-config-data\") pod \"keystone-cron-29399641-t9kgj\" (UID: \"3973d5aa-7f77-403a-bfeb-a8b5c331923d\") " pod="openstack/keystone-cron-29399641-t9kgj" Nov 24 10:01:00 crc kubenswrapper[4563]: I1124 10:01:00.327722 4563 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3973d5aa-7f77-403a-bfeb-a8b5c331923d-combined-ca-bundle\") pod \"keystone-cron-29399641-t9kgj\" (UID: \"3973d5aa-7f77-403a-bfeb-a8b5c331923d\") " pod="openstack/keystone-cron-29399641-t9kgj" Nov 24 10:01:00 crc kubenswrapper[4563]: I1124 10:01:00.334019 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3973d5aa-7f77-403a-bfeb-a8b5c331923d-fernet-keys\") pod \"keystone-cron-29399641-t9kgj\" (UID: \"3973d5aa-7f77-403a-bfeb-a8b5c331923d\") " pod="openstack/keystone-cron-29399641-t9kgj" Nov 24 10:01:00 crc kubenswrapper[4563]: I1124 10:01:00.334062 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3973d5aa-7f77-403a-bfeb-a8b5c331923d-config-data\") pod \"keystone-cron-29399641-t9kgj\" (UID: \"3973d5aa-7f77-403a-bfeb-a8b5c331923d\") " pod="openstack/keystone-cron-29399641-t9kgj" Nov 24 10:01:00 crc kubenswrapper[4563]: I1124 10:01:00.335203 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3973d5aa-7f77-403a-bfeb-a8b5c331923d-combined-ca-bundle\") pod \"keystone-cron-29399641-t9kgj\" (UID: \"3973d5aa-7f77-403a-bfeb-a8b5c331923d\") " pod="openstack/keystone-cron-29399641-t9kgj" Nov 24 10:01:00 crc kubenswrapper[4563]: I1124 10:01:00.341299 4563 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnrx4\" (UniqueName: \"kubernetes.io/projected/3973d5aa-7f77-403a-bfeb-a8b5c331923d-kube-api-access-wnrx4\") pod \"keystone-cron-29399641-t9kgj\" (UID: \"3973d5aa-7f77-403a-bfeb-a8b5c331923d\") " pod="openstack/keystone-cron-29399641-t9kgj" Nov 24 10:01:00 crc kubenswrapper[4563]: I1124 10:01:00.476487 4563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29399641-t9kgj" Nov 24 10:01:00 crc kubenswrapper[4563]: I1124 10:01:00.841372 4563 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29399641-t9kgj"] Nov 24 10:01:01 crc kubenswrapper[4563]: I1124 10:01:01.002986 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29399641-t9kgj" event={"ID":"3973d5aa-7f77-403a-bfeb-a8b5c331923d","Type":"ContainerStarted","Data":"ba0876d5eb1ebb062d1b9fc78169d8bcbee5f72d6441549048824e0e7bc1278a"} Nov 24 10:01:01 crc kubenswrapper[4563]: I1124 10:01:01.003046 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29399641-t9kgj" event={"ID":"3973d5aa-7f77-403a-bfeb-a8b5c331923d","Type":"ContainerStarted","Data":"9312c4b48e22382aeb69ff76a5c7e17ac7393daa3132c2b68f1b5c3dbc3e4a8a"} Nov 24 10:01:01 crc kubenswrapper[4563]: I1124 10:01:01.023153 4563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29399641-t9kgj" podStartSLOduration=1.023129152 podStartE2EDuration="1.023129152s" podCreationTimestamp="2025-11-24 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 10:01:01.016668942 +0000 UTC m=+3438.275646389" watchObservedRunningTime="2025-11-24 10:01:01.023129152 +0000 UTC m=+3438.282106599" Nov 24 10:01:03 crc kubenswrapper[4563]: I1124 10:01:03.017623 4563 generic.go:334] "Generic (PLEG): container finished" podID="3973d5aa-7f77-403a-bfeb-a8b5c331923d" containerID="ba0876d5eb1ebb062d1b9fc78169d8bcbee5f72d6441549048824e0e7bc1278a" exitCode=0 Nov 24 10:01:03 crc kubenswrapper[4563]: I1124 10:01:03.017709 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29399641-t9kgj" 
event={"ID":"3973d5aa-7f77-403a-bfeb-a8b5c331923d","Type":"ContainerDied","Data":"ba0876d5eb1ebb062d1b9fc78169d8bcbee5f72d6441549048824e0e7bc1278a"} Nov 24 10:01:04 crc kubenswrapper[4563]: I1124 10:01:04.268520 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29399641-t9kgj" Nov 24 10:01:04 crc kubenswrapper[4563]: I1124 10:01:04.296446 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3973d5aa-7f77-403a-bfeb-a8b5c331923d-config-data\") pod \"3973d5aa-7f77-403a-bfeb-a8b5c331923d\" (UID: \"3973d5aa-7f77-403a-bfeb-a8b5c331923d\") " Nov 24 10:01:04 crc kubenswrapper[4563]: I1124 10:01:04.296556 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3973d5aa-7f77-403a-bfeb-a8b5c331923d-combined-ca-bundle\") pod \"3973d5aa-7f77-403a-bfeb-a8b5c331923d\" (UID: \"3973d5aa-7f77-403a-bfeb-a8b5c331923d\") " Nov 24 10:01:04 crc kubenswrapper[4563]: I1124 10:01:04.296583 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3973d5aa-7f77-403a-bfeb-a8b5c331923d-fernet-keys\") pod \"3973d5aa-7f77-403a-bfeb-a8b5c331923d\" (UID: \"3973d5aa-7f77-403a-bfeb-a8b5c331923d\") " Nov 24 10:01:04 crc kubenswrapper[4563]: I1124 10:01:04.296611 4563 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnrx4\" (UniqueName: \"kubernetes.io/projected/3973d5aa-7f77-403a-bfeb-a8b5c331923d-kube-api-access-wnrx4\") pod \"3973d5aa-7f77-403a-bfeb-a8b5c331923d\" (UID: \"3973d5aa-7f77-403a-bfeb-a8b5c331923d\") " Nov 24 10:01:04 crc kubenswrapper[4563]: I1124 10:01:04.301955 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3973d5aa-7f77-403a-bfeb-a8b5c331923d-kube-api-access-wnrx4" 
(OuterVolumeSpecName: "kube-api-access-wnrx4") pod "3973d5aa-7f77-403a-bfeb-a8b5c331923d" (UID: "3973d5aa-7f77-403a-bfeb-a8b5c331923d"). InnerVolumeSpecName "kube-api-access-wnrx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 24 10:01:04 crc kubenswrapper[4563]: I1124 10:01:04.302420 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3973d5aa-7f77-403a-bfeb-a8b5c331923d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3973d5aa-7f77-403a-bfeb-a8b5c331923d" (UID: "3973d5aa-7f77-403a-bfeb-a8b5c331923d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 10:01:04 crc kubenswrapper[4563]: I1124 10:01:04.318414 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3973d5aa-7f77-403a-bfeb-a8b5c331923d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3973d5aa-7f77-403a-bfeb-a8b5c331923d" (UID: "3973d5aa-7f77-403a-bfeb-a8b5c331923d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 10:01:04 crc kubenswrapper[4563]: I1124 10:01:04.334910 4563 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3973d5aa-7f77-403a-bfeb-a8b5c331923d-config-data" (OuterVolumeSpecName: "config-data") pod "3973d5aa-7f77-403a-bfeb-a8b5c331923d" (UID: "3973d5aa-7f77-403a-bfeb-a8b5c331923d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 24 10:01:04 crc kubenswrapper[4563]: I1124 10:01:04.398609 4563 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3973d5aa-7f77-403a-bfeb-a8b5c331923d-config-data\") on node \"crc\" DevicePath \"\"" Nov 24 10:01:04 crc kubenswrapper[4563]: I1124 10:01:04.398734 4563 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3973d5aa-7f77-403a-bfeb-a8b5c331923d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 24 10:01:04 crc kubenswrapper[4563]: I1124 10:01:04.398797 4563 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3973d5aa-7f77-403a-bfeb-a8b5c331923d-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 24 10:01:04 crc kubenswrapper[4563]: I1124 10:01:04.398856 4563 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnrx4\" (UniqueName: \"kubernetes.io/projected/3973d5aa-7f77-403a-bfeb-a8b5c331923d-kube-api-access-wnrx4\") on node \"crc\" DevicePath \"\"" Nov 24 10:01:05 crc kubenswrapper[4563]: I1124 10:01:05.032416 4563 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29399641-t9kgj" event={"ID":"3973d5aa-7f77-403a-bfeb-a8b5c331923d","Type":"ContainerDied","Data":"9312c4b48e22382aeb69ff76a5c7e17ac7393daa3132c2b68f1b5c3dbc3e4a8a"} Nov 24 10:01:05 crc kubenswrapper[4563]: I1124 10:01:05.032747 4563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9312c4b48e22382aeb69ff76a5c7e17ac7393daa3132c2b68f1b5c3dbc3e4a8a" Nov 24 10:01:05 crc kubenswrapper[4563]: I1124 10:01:05.032473 4563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29399641-t9kgj"